
August 6, 2024 • 60 mins

In this episode of Litigation Nation, we delve into the complex world of border security, migration, and the impact of artificial intelligence with special guest Petra Molnar, a lawyer, author, and anthropologist. Petra's extensive experience in conflict areas and militarized spaces around the world has provided her with unique insights into the intersection of technology and human rights.

The episode also examines the regulatory landscape of artificial intelligence in border security, with a focus on the European Union's AI Act and its implications for governing AI technologies. Petra raises concerns about loopholes in the legislation and the need for stronger regulations to protect human rights.

Read The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence here: www.bit.ly/LNBookshop
Learn more about Petra and her work by visiting: www.PetraMolnar.com


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Jack Sanker (00:03):
Welcome, Litigation Nation. I'm your host, Jack Sanker, along with my co-host, Danessa Watkins. Today, we're joined by a very special guest, Miss Petra Molnar, who's a lawyer and anthropologist specializing in migration and human rights. Petra has been working in this space since the mid-2000s as a community organizer. Her work, as we'll

(00:24):
get into it, has taken her to conflict areas and militarized spaces in Jordan, Turkey, Greece, the Philippines, Palestine and Israel, Kenya, Colombia, and on the US-Mexico border and all over Europe. She's currently the associate director of the Refugee Law Lab at York University and faculty associate at the Berkman Klein

(00:46):
Center for Internet & Society at Harvard University. You can find her published in the New York Times, The Guardian, Al Jazeera, the Transnational Institute, and all over the Internet. She joins us today to talk about the policy and law of border security, migration, immigration, and surveillance, which she writes about brilliantly in her new book, The Walls Have

(01:06):
Eyes: Surviving Migration in the Age of Artificial Intelligence. Petra, welcome to the show.

Petra Molnar (01:13):
Thanks so much for having me. It's a real pleasure
to be here with you today.

Jack Sanker (01:18):
I have a whole section here that's kind of like
your bio bullet points of who you are and everything else, but I think it would probably be best if you just summarized who you are, what your background is, and why it is that you're
writing on this topic.

Petra Molnar (01:32):
Yeah. Sure. I'd be happy to. So like like you said,
I'm a lawyer and an anthropologist by training, but I'm not a technologist. And about 6 years ago, I fell into this area of trying to understand how new technologies are impacting the way that people are moving and crossing borders.
And it's taken me on a bit of a wild ride all around the world

(01:54):
and culminated in the publication of my first book, called The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence, which tries to tell this global story of how AI and surveillance and drones and all sorts of other experimental projects are playing out across the world's
borders.

Danessa Watkins (02:11):
Did you get into the technology aspect of it
just because of your personal experiences and seeing, you know, what was out there and what was being used and maybe things that we mainstream don't hear about?

Petra Molnar (02:23):
Yeah. Kind of. You know? So I, am a practicing
lawyer, although I don't take too many cases anymore. But back then, I was, you know, doing more, I guess, you could call it kind of traditional issues, you know, things like gender-based violence, you know, other types of issues like immigration detention, but not technology.
And it was really as a result of an event that we held at work,

(02:47):
kind of on technology and power, trying to understand how new projects that are kind of being introduced are exacerbating racism, discrimination, and just terrible experiences that people are having. And then I started asking questions like, what is happening on the immigration side? You know? But, again, I'm not a tech person by training. And back then, my

(03:09):
knowledge of these things was really limited.
We're talking, like, Wikipedia-level knowledge of what's an algorithm. And I found it quite intimidating because sometimes I think the technology space can be a bit difficult to penetrate. But I saw time and again that these new kind of projects and technologies were being introduced without really having a conversation about their human rights impact, what

(03:31):
it's doing to real people. And from a legal perspective, that was really troubling for me. And so, yeah, it was a bit of a happy accident, I suppose.
But it's taken me, yeah, on this journey of trying to
understand this interplay.

Jack Sanker (03:45):
And Danessa was asking about, you know, your,
like, quote, unquote, personal experiences, and I think it's, like, well worth highlighting here that, maybe unlike some academic work, this book and what you had to do to get this book together involved quite a lot of, like, literal boots on the ground, going to hotly contested, sometimes

(04:08):
militarized spaces, making, you know, I don't know what you wanna call yourself in this scenario, but I would say, investigative reporting to some extent, and actually kind of getting your hands dirty, quite literally. And so if you wouldn't mind just kinda giving the audience a taste for where you were physically, and

(04:33):
what you, kind of, the spaces you were in as you were putting together what would eventually turn out to become this book.

Petra Molnar (04:39):
This is, I think, where the kind of anthropology
hat comes in, this commitment of trying to work from a grounded perspective. So being present in spaces where these technologies are playing out. And, again, it kind of happened organically, but as I was expanding my own understanding of how new technologies are being used at the border, I went from Canada

(04:59):
to different parts of Europe. I ended up living in Greece for almost 3 years because it was one of the kind of key sites of technological experimentation at the border. But then also because I've always worked from a kind of a comparative perspective, I find comparison to be a very, very useful tool of analysis.
And so being able to say, okay. Well, this is happening along

(05:21):
the fringes of Europe, but what is happening at the US-Mexico border? What's happening in East Africa? What's happening in occupied Palestinian territories? All of these spaces speak to one another.
But you're right. I mean, it is kind of ultimately coming from a commitment of being present in these spaces because there's only so much you can do from the ivory tower or an

(05:42):
office somewhere.

Jack Sanker (05:43):
Right.

Petra Molnar (05:43):
And I think it's perhaps also motivated by the
fact that I, you know, have my own migration story that has motivated, I think, perhaps my work in this area more generally. But also the commitment to try and tell complex stories from a human perspective because I really didn't wanna write an academic book that would be sitting on a

(06:03):
shelf somewhere that was just impenetrable and really heavily written. This is meant more as a story to be able to illustrate some of the impacts of this technology on real
people.

Danessa Watkins (06:16):
Yeah. And that's what I I mean, I flew
through your book. So, I think what's great about The Walls Have Eyes is you kind of start out with this personal story of something you experienced, people you met along the way, their struggles, you know, sharing their perspective, and then describe the technologies that are being used, really, I don't know, unfairly, illegally, I would

(06:41):
say, even though maybe the laws aren't in place yet, but it puts a real humanization to what, like you said, could otherwise just be kind of a stale talk about these technologies. But once you start meeting these players and the people that you came into contact with and how it affects their lives, it, yeah, it's a page-turner, I will say that.

(07:02):
But also it made me feel like, wow, my head's been buried under the sand to not realize, you know, how these things are being used.
So I can't say enough good things about the book. I've already shared it with all of my family. So actually, my mother-in-law is in Arizona, and she had no idea about the robot dogs

(07:24):
that are being deployed there. Can you discuss a little bit
about that?

Petra Molnar (07:28):
Yeah. For sure. I think that's that's probably one
of the more visceral. So when I first went to Arizona, as, again, one of these spaces where technological experimentation intersects with the border, I also had no idea that it was going to be kind of one of these flashpoints for, again, these really high-risk technologies that are essentially something

(07:49):
that you might see on, you know, a sci-fi episode of, like, Black Mirror or some other show, you know. Literally. Yeah. Literally. Right? And so when in Arizona, I started working with some search and rescue groups that are there to assist people who are crossing the desert. You know, because the Sonora Desert is such a beautiful place, but it's also a

(08:10):
very deadly place, and people lose their lives as they're making the journeys to try and claim asylum. And then you have these search and rescue groups that are often also sometimes made up of, you know, 70- and 80-year-olds, who instead of relaxing on a sun lounger are going to drop water in the desert or show up for people in these really kind ways.
And I spent some time with one particular group called the

(08:32):
Battalion Search and Rescue, and we went and trekked into the Sonora Desert to visit a memorial site of Mister Alvarado, who was a young husband and father who essentially died mere kilometers away from a major highway as a result of these kind of draconian surveillance technologies that are pushing people more and more into the desert. But right around that time, in probably one of the

(08:55):
most surreal moments of my career, and now that you've read the book, I think you've probably seen I've had many of such moments. It was right around the time that we were literally on the sands of the Sonora, the Department of Homeland Security announced that they would be augmenting their kind of migration management arsenal with Robodogs, which are

(09:16):
essentially a quadruped military-grade technology that is autonomous and that would be joining this arsenal of border enforcement tech and chasing people like Mister Alvarado through the desert. I mean, this is some of the sharpest kind of manifestations of the technologies that I explore in the book, but by no means the only tech that's also used at the border.
You know? I mean, especially the US-Mexico territory is one where

(09:39):
so much technological experimentation occurs, whether it's, like, AI towers that are dotted throughout the Sonora, different surveillance equipment, license plate readers. It creates this surveillance dragnet that people
are increasingly caught in.

Jack Sanker (09:53):
The term that that we're using here, which I find
so interesting, but it's worth connecting the dots on, technological experimentation. It's implied, and I think I mean, it's also true, but it's implied that the different policies, technologies, or just ways of doing things are

(10:13):
being tried out on the borders, not just the United States-Mexico border, but you get into a lot of detailed places all over the world. But the term, you know, experiment is implying that, you know, they're gonna take certain elements of those things. They either work or don't work and then will broadly end up being applied towards

(10:35):
domestic populations. Right?
This is, I think, it's Foucault's Boomerang. I don't know if you know. Yeah. It's exactly that, which is, you know, a police, policing tactics or technologies that will work against, you know, a population abroad will inevitably find its way back to the domestic population. And,

(10:58):
and you give several examples in the book.
The drones that are being deployed on the southern border of the United States, when they're not being used by, like, Department of Homeland Security or whatever other agencies are using them, they're being lent to local law enforcement, for example. Many, many things like that. So I think it's so interesting because if you're coming from this, you're looking

(11:20):
at this kind of, you know, border security thing, which is, like, obviously, a very political issue from whatever perspective you wanna come at it from,

Petra Molnar (11:27):
I

Jack Sanker (11:27):
think it's extremely politically charged,
but it's so interesting because you ought to care. You should care what's happening down there from whatever perspective you're approaching it from. And anyone who has this sort of, you know, maybe viewpoint that's common on the

(11:49):
right of, like, big government surveillance or tracking and all those things. You don't have to be conspiratorial. It's, like, literally happening, and this is the, you know, the proven account for it.
Right?

Petra Molnar (12:02):
Yeah. Exactly. Exactly. And, you know, even the
robodogs that we were just talking about, right, a year after they were announced at the US-Mexico border, the New York City Police Department held a press conference and proudly announced that they wanted to use robodogs on the streets of New York to, quote, keep New York safe. And one was even painted white with black spots on it like a Dalmatian.

(12:23):
I mean, again, it's this kind of normalization of tech that starts at the border and then bleeds over into other facets of public life. So I think it's kind of a twofold thing here. Right? Like, I think it's really important to pay attention to what happens at the border because they are spaces that are, you know, very opaque, very discretionary, difficult to

(12:43):
understand why certain decisions get made, which creates this kind of perfect proving ground or testing ground for experimentation that maybe at first blush wouldn't be allowed in other spaces. But then what happens is, over time, it becomes normalized and then starts proliferating into other
facets.
Whether that is, again, the robodogs on New York streets or the

(13:04):
surveillance of protesters or even sports fans. There are now conversations about using facial recognition in sports stadiums, for example, right? So again, it is this normalization of tech that we really need to pay attention to. Well, I was

Danessa Watkins (13:17):
gonna say a couple shows ago, on our
podcast, we were talking about biometric data and recent laws that were passed regarding that, or that are being used more frequently in the US, to try to protect people from the collection of that type of data or the use of it or how it's stored. So I did think it was interesting, I guess, and not

(13:38):
surprising that, with the United Nations High Commissioner, the question becomes, well, okay, so they have this huge database. Now who has access to it? I saw that it's being shared with the US, even if those refugees aren't making it here. You mentioned

(13:58):
that it's vulnerable to hacking. I mean, data is king now.
So, yeah, once we have all of this information, very sensitive information about people, who has access to it, how is it protected, why is it being shared, you know, for what purposes? Because the people whose data is being collected,

(14:19):
they don't have any say in that. So, yeah. What, aside from, or if you wanna talk about Kenya, but I'm sure that there are other places where this is happening, and, you know, what are the
effects of that?

Petra Molnar (14:31):
Yeah. And thanks for bringing that up, Nessa,
because that's kind of the underlying logic sometimes, the fact that data is the new oil. Right? And we need more and more data to power these technologies of surveillance and automation and just, you know, this kind of incursion of tech in our daily life. Like, it cannot be done without data.
But then it also is important to ask who is the data subject and

(14:51):
who can opt out from being part of these massive databases. Because ultimately, you know, the conversation I think that we're having is about technology, but it's actually about power. And it's also about the opportunity to say no, I don't want to be part of this. And when we're talking about spaces like refugee camps, that is where there's a huge power
differential.

(15:11):
And that's even inherent in the way that international organizations like the United Nations are operating in a pretty problematic way, because they've also normalized biometric data and iris scanning, for example, in refugee camps in Jordan. Instead of people using an identity card, they now have their eyes scanned, and then they get their weekly food rations. Like, at first blush,

(15:34):
that sounds good, because, of course, you want to be able to disperse humanitarian support quickly and efficiently. But how do people actually feel about that? Right?
Well, I mean, how would we feel if we went into, I don't know, a grocery store and all of a sudden it was like, well, we have to scan your irises before you can come in? People would be up in arms. But when that actually happens in a refugee camp, those discussions are not being had. And, you

(15:56):
know, I've spoken with people who say, well, I feel really uncomfortable about this, but if I don't get my irises scanned, I don't eat that week. That's not really free and informed consent.
Right? It really breaks down along these lines of just huge power differentials. And also it reduces people to data points, rather than actually seeing their full humanity. And one last thing I will say is, and I already

(16:19):
alluded to it, is the vast kind of human rights impact of what happens when there are data breaches or when really sensitive information is inappropriately stored or shared. And this has already happened.
The UN collected a bunch of information on Rohingya refugees who were escaping Myanmar, ended up in Bangladesh, and have been living there for many years now. They collected all this

(16:41):
information and then inadvertently shared it with Myanmar, the government that the refugees were fleeing from. I mean, that's a huge breach. Right? Huge.
Yeah. And yet, how can this happen? We really need a lot more conversations about just why is even this data being collected in the first place, and why is it being shared and stored in these inappropriate ways.

Jack Sanker (17:02):
I find so many of the the kind of, like I I don't
wanna, you know, take a strong political position on this, but so many of, like, the talking points from either side you wanna come at here. And the kind of conspiratorial, maybe, you know, tinfoil hat stuff that people often will talk about and

(17:23):
you kind of brush off, until, but no one's actually, like, pointing the finger at the right thing when they kinda say this. And so the 2017 Extreme Vetting Initiative that you've discussed in the book, which ICE, I believe, is still using, or I don't know if it's been halted or not.

(17:43):
But which involves assessment of someone's, you know, personal social media, like, risk profiles that's based on, as I understand it, a completely opaque algorithm for which we have no idea what, or very little idea what, the inputs are. A lot of that information is classified, something you mentioned in the book.

(18:05):
Travel records, of course. And the goal of this, you know, giant Rube Goldberg machine of social assessment of someone is, you know, to determine whether that person would, quote, be a positively contributing member of society and predict whether they intend to commit criminal or terrorist acts after entering the community or entering the
(18:26):
terrorist acts after enteringthe community or entering the
country, unquote. And that is, to me, I mean, that is so, you know, Minority Report. That is, you know, for people that bang their fists on the table about, like, you know, the Chinese social credit score or whatever, that's kind of exactly what

(18:47):
this is here.
And so it's so interesting because folks that may otherwise not be inclined to be sympathetic towards the things that are happening to migrants at the border, who, you know, haven't asked for it and don't deserve it, ought to care, right, because of the way that, you

(19:07):
know, these are the big government surveillance and, things that, you know, they say that they're worried about. It's already happening. It's just happening to a population that no one cares about. Yeah.

Petra Molnar (19:21):
And that's exactly it. And it doesn't just stay
there. Right? Like, just because things are tested out at the border, then they become normalized, and then, you know, it opens up the ambit much, much more broadly. And I think the examples that you're bringing up, Jack, are really
quite disturbing.
Right? Because they are forward-looking. It is really like a Minority Report situation where, like, really, are we comfortable

(19:42):
with governments making predictions about our behavior in the future, and based on what? Right? Oftentimes, we don't even know what the parameters are.
I mean, the Extreme Vetting Initiative is perhaps the most extreme example of this, but other jurisdictions have done the same. New Zealand, for example, introduced kind of a pilot project where they wanted to root out, quote, unquote, likely troublemakers. What does that even mean? Right? What

(20:04):
grounds will that break down upon?
Right? I mean, we can all imagine how incredibly racist and discriminatory this can be. Right? And I think, again, as we are seeing a shift to the right all across the world and having these really heightened conversations about migration, about security, about the environment, right, and the fact that many people are going to be migrating as a result of

(20:25):
environmental degradation. If we normalize these kind of predictive technologies, chances are that we're gonna be seeing
more and more of them.

Danessa Watkins (20:33):
Mhmm. Well, I was wondering too. I mean, we
saw in the US, particularly, like, after 9/11, I think more Americans were open to the idea of higher security and more surveillance because we felt, you know, we had been attacked. Okay. Now we're okay maybe with the government monitoring more. Did we see something like that with COVID-19 as well, where

(20:55):
suddenly everyone kind of feels vulnerable? And so maybe people are more open to, and I don't mean obviously people at the border, domestic areas, are people more open to these technologies that are intrusive, simply because they feel more
vulnerable?

Petra Molnar (21:14):
Yeah. Absolutely. You know, and there's been others who've been doing really amazing work kind of tracking this vulnerability and how it's also kind of weaponized against people in crisis. Right? Because, of course, I mean, the COVID-19 pandemic was and is a global health crisis.
Right? And so it makes sense that people who are, all of us, experiencing this kind of unprecedented event, that maybe,

(21:35):
you know, psychologically, there's a predisposition to grasp at straws and say, well, you know, whatever we can get to make the world better, we are going to do that. But actually there were a lot of laws, not even just surveillance, but, like, a lot of, you know, stretching of what normally people would not be comfortable with. But again, because we were in a massive crisis, it normalized the

(21:57):
ability of the state to say, well, we're gonna incur or create incursions on people's, you know, freedom of movement. Right?
Mhmm. People's freedom of expression, data gathering indiscriminately, and all of that. And a lot of that, those vestiges have remained with us. Right? And I think that's the
concern.
Whenever we are operating from a crisis mentality, unfortunately,

(22:17):
you know, scholars have for decades, centuries, been kind of trying to raise alarms about this, that crisis actually breeds the normalization of more and more control, right? And unfortunately, we are seeing a time that is informed by crisis thinking. Right? Whether it's the environment or political crises or just the widening, you know, kind of divisions between

(22:43):
groups. I think the concern is that, again, technology is a lens through which to understand power and who is around the table when we innovate, and why, and who gets to decide. Like, yes, we need more surveillance or we need more robodogs or we need more data, versus actually thinking about, well, how can we use technology to improve the lives of people?

(23:04):
There's kind of that other side to it too, but, again, it always breaks down along the lines of power and privilege, especially
in times of crisis.

Jack Sanker (23:12):
Mhmm. Speaking of COVID-19, you write about in your book some of the lockdowns that were in place in European countries. I mean, I think also applicable here. But in Greece, for example, the kind of lockdown procedures and protocols that were in place, the government there took advantage of those to keep migrants, for example, in camps,

(23:36):
for much longer than they would have been under normal circumstances. And, again, that to me just echoes so loudly of the, like, conspiratorial, you know, if you remember, in the United States, like, oh, they're gonna use COVID to put us all in camps, you know, kind of crazy talk, that you, you know, ignore.
And then here we are. Here's, like, a, you know, quote, unquote, Western country, which kinda did that. And it's

(24:02):
you know, it kinda happened there. It's the, you know, Orwellian stuff that, you know, a certain class of political pundits in this country, you know, make their money talking about all the time. But because it happens to a group of people that, you know, no one seems to worry about too much, it's

(24:22):
completely, you know, like, I learned this reading your book. And you would think that folks that were concerned about that type of thing would have learned this when it happened 2 or 3 years ago. And it just flies completely under the radar just due to, really, the identity of those who are affected by it the most. And it's absolutely fascinating when you kinda think

(24:44):
about it through that lens, at least for me. Yeah.

Petra Molnar (24:49):
And I think there's a lot of kind of flying under the radar, pun intended, since we're talking about
radars.

Danessa Watkins (24:54):
Right. Yeah.

Petra Molnar (24:55):
That happens at the border. Right? Because, again, they are spaces of opacity and also a space of crisis. Right? And we are seeing this around the world where borders are increasingly becoming militarized, and also difficult to access for human rights monitors like me, but also for journalists.
Right? It again kind of plays into this mentality that the

(25:16):
border is this kind of free-for-all where anything can happen, this kind of frontier zone of experimentation, which has happened for many, many years. It's just now we have this tech that also plays into it. And that's perhaps where, you know, I remember, Danessa, you said something early on in our conversation that, you know, there's not a lot of law right now, and that's precisely it. Because we don't have a lot of

(25:37):
guardrails around just technological experimentation, period, let alone at the border, it exacerbates a lot of the kind of high-risk projects that are playing out there, largely
unknown.
Oftentimes, you know, people find out about them as a result of some, like, random investigation or through sheer chance, or a journalist does a piece and all of a sudden, you

(25:59):
know, a new project hits the media. But there isn't this kind of commitment to transparency and public accountability when it comes to tech development generally, I would say, but
especially at the border.

Danessa Watkins (26:11):
And I was, like, I mean, in normal society, a lawyer is held, you know, to a certain level of prestige and you get a certain amount of respect from your community. But did you find in going to these places that it was the opposite, that you are now seen as the enemy by some of these? I would think lawyers would stand right out as, nope, stay away, get out of here. Anyone with a camera,

(26:32):
anyone with a law degree, anyone with the education that would
spread the information.

Petra Molnar (26:37):
Definitely. Definitely. I mean, that was the
case in a variety of the different areas that I've worked in, where it was almost a liability to say, oh, I'm a lawyer. You know, but then on the other hand, you know, I think also when you are a lawyer, you, of course, are aware of your own rights, and also your ability to kinda do your job depends on, you know, speaking truth to power, so to

(26:58):
speak, and asking the hard questions. And so, definitely, though, I noticed a trend where, yeah, as more borders become militarized and difficult to access, it's no accident that states don't want people who are human rights monitors or who are journalists kind of poking their nose in there when, you know, a lot of, again, that kind of

(27:18):
experimentation is happening there without public scrutiny. So it's a little bit of a double-edged sword. You know? I think it definitely helps to be a lawyer doing this work, but sometimes it can make you be seen as somebody that they
definitely don't want there.

Danessa Watkins (27:34):
Yeah. Yeah. I had a friend who did some work in Guantanamo Bay, and he said every time he flew down there, it was like, you know, treated nice, treated nice. He's a gentleman, he's in a suit. And then as soon as he stepped off the plane and got there, it was like, he felt like he was one of the inmates.
Like the guards, you know, didn't want anything to do with him, like would put up, you know, roadblocks every step of

(27:55):
the way, make him wait hours to talk to his clients, those sort of things. And I just would imagine you would encounter those same sort of, yeah, I guess, arbitrary blockades, you know, when you get to these places.

Petra Molnar (28:09):
Yeah. Definitely. And the other issue that I have had, and it's not just unique to me, but I think to a lot of people in this space, is trying to also engage with the private sector that's involved in this. And I know that's not something we've talked about yet, but they're a major player in the development and deployment of this technology. Because oftentimes, states just can't do it themselves in-house.
Right? Like, they need to contract out these projects to

(28:31):
private companies. And when you go to, you know, some of the conferences where this technology is being sold and kind of proposed as a solution to these complicated, quote, unquote, societal problems. Right? Like, oh, you're worried about refugees?
Here's a robodog. If you buy this, this will solve your problem. That's the kind of thinking. As soon as they see that you're a lawyer or an academic, they don't wanna talk

(28:51):
to you. Right?
Because they know that you're there trying to get at the parameters of what they're doing and why and all of this. And so that was definitely something that I was, you know, I wouldn't say I was surprised by, but it was like yet another element. Because when you're trying to understand this ecosystem and this multibillion-dollar border industrial complex that has grown up around the development of migration management tech.

(29:15):
Trying to understand how the private sector fits into it and, like, following the money is very difficult because a lot of it isn't public either, and they don't wanna talk about it.

Jack Sanker (29:23):
I I'm so glad you introduced me to the term, or to
the phrase, border industrial complex, because I could just, like, mentally, like, find and replace, like, do, like, control-F, like, find and replace, like, military industrial complex, and, like, all of my, like, skepticism and criticisms of, you know, like, foreign engagements or whatever it is. I

(29:45):
could just, like, take those things, like, cut and paste, and now I have a useful framework for thinking about, when I hear something in the news or when I'm just thinking about, like, the border, that's such a useful phrase for me to, like, conceive of, you know, everything. Because I think today, in particular, I mean, today, it, and this is kind of

(30:07):
like, broadly cross political spectrum, like, skepticism of, you know, foreign engagements. I mean, you know, like, all of a sudden, there's a whole cohort of folks that are, you know, isolationist or whatever you wanna call it. And so the term, you know, military industrial complex is so ubiquitous, and it's thrown around and people understand

(30:29):
inherently now, like, what that means.
And so to think, to just go, oh, like, the border industrial complex. For me, it's just like, oh, like, that just clarifies so much. I don't have to do any more thinking from, like, the ground up. Like, it's the same principles, all apply. And I, like, I remember reading the book.
I was like, oh, that's just, I understand so much about this

(30:50):
now just from flipping that switch in my own brain. It's
very helpful.

Petra Molnar (30:55):
Oh, thanks. And, you know, I I can't take credit
for that. I mean, I would also urge listeners to check out the work of Todd Miller, who's an amazing journalist in Arizona. He's been working on the border industrial complex stuff for years, and I think it's a really helpful framing. Yeah.
Because, again, it centers the kind of capitalism that is inherent in the way that we develop technology and the fact that there's big money being made at the border, just like in

(31:18):
military engagements, but also at the border.

Jack Sanker (31:20):
Well, and also in that the investments in it aren't necessarily, like, optimized for, you know, solving the problem so much as they are optimized for prolonging the spending of money. So, and it is Exactly.

Petra Molnar (31:35):
Yeah. Exactly. And I think that's something that we
don't really talk about. Like, I actually think we've kind of lost the plot on a lot of the conversations around migration because, you know, and I understand that it is a very complex, very fraught topic that is very divisive. But, you know, for me as someone who's been working in migration on and off

(31:55):
in different ways since, you know, 2008, all the people that I've met, and it's been hundreds, right?
Nobody wants to be a refugee, right? Like, the thing is people are forced to move and they're also exercising their internationally protected right to do so. The right that belongs to me, to you, to listeners, to everyone on the planet. Right?

(32:15):
But they are moving because of war, because of destabilization, oftentimes because of Western action, right, in places around the world, environment, all sorts of reasons.
So I think we actually need to flip the conversation and say, well, how do we address the root causes of forced migration? Like, how can we support communities locally and spend even a fraction of this money that is being used in the border

(32:38):
industrial complex? Like, I think it's projected to be around $70 billion in the next couple years. I mean, even a fraction of that would go so far to try and, again, deal with the root causes of displacement. That's where we need to start. Because, again, you know, I think we've painted people on the move as threats, as criminals, as people who are taking advantage of the system somehow, versus looking at

(33:02):
them as human beings. And, you know, migration
can happen to all of us. Right?

Jack Sanker (33:07):
Mhmm. Yeah. And, you brought up the, sort of
international rights of movement and these things. And I, you know, this is a kind of a lawyer podcast whose audience is probably largely comprised of a lot of other lawyers, things like that. Two concepts that I thought may be worth mentioning:

(33:34):
non-refoulement, and then there is the 1951 Refugee Convention.
To me, those seemed to be, like, the underpinnings of this idea of internationally protected rights of movement. And we're not, I'll speak for Danessa here, we're not the type of attorneys that have to deal with this. So could you

(33:57):
could you tell us what those things are and why they're important, and who those rights or obligations apply to?

Petra Molnar (34:05):
Yeah. Sure. So I'll take off my anthropology
hat and put my lawyer hat on. I'm a reluctant lawyer, but still a lawyer. Yeah.
Yeah. Absolutely. I mean, again, I think this is one of the foundational underpinnings of these conversations that we sometimes forget about because, you know, it's, again, it's a right that is internationally protected and available to all

(34:26):
of us. The right to leave our country of origin if we face persecution, if we face danger, and to seek support somewhere else. And so the 1951 Refugee Convention, which was drafted after World War II, is the foundational international legal document that stipulates that, you know, there's a legal test
that you have to meet.

(34:46):
There are five categories on which you can claim refugee protection, but you have the ability to leave your country and seek protection elsewhere. The non-refoulement principle is a norm under international law that states that you are not allowed to remove someone from your country if they will face persecution in the place that you're removing them to. Right?

(35:07):
And this is something that, again, I think is under threat a lot these days when we talk about practices like pushbacks or the ways that, you know, different states, for example, are not even allowing people to come on their territory and claim asylum. It's almost like some states wanna have it both ways.
Right? Like, they wanna say, oh, we are human rights respecting. We have signed and ratified all these nice, pretty international

(35:29):
documents and also imported them into our own domestic legislation. Right? Like, in the United States, Canada, Europe. But then actually, when push comes to shove, it's like, well, we're going to derogate from them and actually create this other regime where people are not even allowed to actually enter the territory when those rights kind of kick in. But again, there is a really foundational, strong framework on

(35:51):
which to pin responsibility when we're talking about asylum law. This is, you know, also an established line of jurisprudence that has been with us for decades now. But we are seeing this weakening of these norms, unfortunately, these days, and technology plays a part in that because it also kind of disaggregates the border, right, from its physical

(36:11):
location, not to get tootheoretical. But if you can use
technology, right, to push your border either vertically into the skies through drones and different types of surveillance, or horizontally through data collection, different types of surveillance as well, then people are not even making it to your border in the first place.
Right? You're kind of creating the zone of bordering that is

(36:33):
much wider and much more difficult to cross, and that's no accident. There is definitely a weakening of this regime that we've seen across all these different jurisdictions around
the world.

Jack Sanker (36:44):
Well, I think it's because now, the promises and
obligations that many of these countries, you know, purport to support are actually being tested, maybe, at least in large numbers, for the first time in a long time. And it's kind of a, do you actually support these ideas or not? And it's like, push comes to shove, actually, we don't. We actually don't support

(37:06):
them.
And, in fact, you know, one way we can avoid having to abide by these treaties or laws, whether they're international or domestic, and avoid incurring obligations to refugees or migrants, is to, you know, for example, build a huge wall so that you can't come here and claim them. That seems to be how,

(37:27):
like, that seems to be what was happening in, I mean, the example that jumped out to me was your time on the border of Greece and Turkey, and this was in the, what, I can't remember what region it was, but it was the part of Greece that's, like, close to Istanbul and, like, Bulgaria. I don't remember the name of it. But where it's, like, you know,

(37:52):
otherwise, like, a pretty, like, agricultural and kind of, like, I don't wanna say backwater, but certainly not a node of power in Greece, but all of a sudden is getting, you know, billions of dollars in funding to build, like, AI sensor towers and, like, all of this security technology, just because it's on the border, and to keep refugees from coming

(38:12):
into Greece. You know, to keep them from coming in because the moment they set foot inside, they, you know, have a claim to certain rights and privileges, whether the country wants to
uphold them or not.
But at least they have that claim once inside. So it seems like, you know, the investment is coming, you know, to a kind of a poor backwater part of the country that would

(38:35):
love to have, you know, government investment in any other sector, I'm sure. But what they get is, like, you know, sensor towers and robot dogs, to keep out, you know, even poor and more needy people from coming in.

Petra Molnar (38:54):
Yeah. Absolutely. But what's also interesting, right, is, like, so many states use precisely this logic of deterrence, of saying, well, we need more technology. We need a wall. We need Robodogs.
We need AI sensors. Let's put them in these spaces that are sometimes, you know, economically disenfranchised. Right? And we're going to prevent people from coming. But
the thing is it doesn't work.

(39:14):
Right? People will just take more dangerous routes to avoid detection and to avoid surveillance. This is something that's been documented at the US-Mexico border, where since the introduction of this so-called smart wall and smart border regime, deaths have nearly tripled. Because people still continue coming. Right?
They're just taking more circuitous, more dangerous

(39:35):
routes through the desert, which leads to an increased loss of life. But walls don't work, right? When people are desperate, and also, again, when they're exercising their internationally protected rights to asylum, they're going to find a way. And if I can just share an anecdote very recently. So when my book came out in May, I went to Arizona to share it

(39:56):
with some of the search and rescue groups there.
And we had a really nice time in Tucson and some of the neighboring communities. But, you know, because of my ethnographer hat, I was like, what's happening at the border? Let's drive down and see some of this new smart wall stuff that's been introduced and, like, what's changed since my last visit there about a year ago? And so I went down there with journalist Todd Miller, and we were just kind of, you know,

(40:16):
standing there shooting the shit, like, talking about what's happening. And we were in Nogales, which is a border town that's bisected by the wall.
So you have Nogales in Mexico and Nogales in Arizona. And we were right by the wall, which you can go to. You can see it. You know? There are sometimes Customs and Border Protection trucks kind of rumbling by.
There's a fixed, integrated kind of surveillance tower there.

(40:39):
You know? There's now a new smart track, which I guess is gonna have all sorts of sensors there, whatever. And it was a really, really hot day. And we're kinda standing there sweating, really dusty, you know, and looking over into this area where the tower stands.
And all of a sudden, I notice movement out of the corner of my eye. And we literally saw a young man scale the wall, jump

(40:59):
down in front of us into the sand, right underneath the surveillance tower, shake himself off, and run off into the neighborhood. Right? And to me, that is such an example of just, like, how these surveillance techniques don't work. Right?
Because people will find a way. So instead of spending and wasting all this money on surveillance tech, like, let's think about what else we could be doing to improve the asylum system, to give people access to lawyers, to support them psychosocially, again, to address root causes of migration in the first place rather than investing in surveillance technology that doesn't even work. Then it also makes me think though, like, is it about the performance of surveillance?
Right?

Jack Sanker (41:38):
Right.

Petra Molnar (41:38):
Rather than it actually working, it's about the
state making us feel like they have it under control and they're, like, introducing all this tech, and now it's not actually working in the first place, but maybe that's not the point. Maybe it's just this theater of the border that
we have to pay attention to.

Jack Sanker (41:53):
I mean, it's manifestly not working. And if your metric is, you know, keeping, let's say, people out, well, anytime you turn on the news or look on TV, you see people screaming about how many people are coming in. So it's, like, objectively and manifestly not working, if that's the case. No matter

(42:14):
how much money is being spent, it's still happening. So to your point, perhaps this technology and everything else is really not a worthwhile investment, even if that is your goal, which I'm not saying it should be.
But if it is, it's not working. They're, like, still
coming as you mentioned.

Danessa Watkins (42:30):
Obviously, not surprisingly. AI has been all
over the news for a bunch of different reasons, but it seems like finally, the European Union is paying attention to the dangers of AI. I think on a global level, we are. People are, you know, scientists are coming forward and saying, you know, here's what can happen. We need to pay attention to this

(42:52):
and how it's being used.
But specific to your line of work, you've gone to some of these conferences. I know some of the big ones before the UN. What are you seeing with regards to where we're at in trying to regulate artificial intelligence when it comes to the borders?

Petra Molnar (43:11):
Yeah. It's a really interesting moment when it comes to the kind of governance and regulatory conversations that are being had, and, unfortunately, it's been a bit disappointing in the last little while. I wanna point specifically to the European Union's AI Act, which was, you know, something that was in the works, this big omnibus piece of lawmaking to try and regulate AI. And it took

(43:31):
many years to come to its final form, right, which happened earlier this year. And, you know, there was a group of us, in full disclosure, some lawyers, some civil society, some academics, trying to get members of the European Parliament to think about the human rights impacts of border technologies and call for some amendments to the act.
For example, saying you need to ban predictive analytics that

(43:53):
are used for border interdictions or pushbacks. Right? Like we were talking about with the non-refoulement principle, something that's actually blatantly illegal under international law, or using individualized risk assessments using AI at the border, things like that. But again, given the fact that migration is a very politically charged issue, a lot of these amendments just didn't pass. And so now when you

(44:16):
actually look at the text of the AI Act, at first blush, it might seem like, well, you know, what's the problem?
There's a risk matrix, you know, that is used to assess different technologies, and border tech largely falls under high-risk technologies. So isn't that a good thing? Well, except, of course, for the legal listeners in the room, it won't be a surprise when I say that there are so many ways that, again, we

(44:38):
can derogate from this kind of risk principle that is there. Because as soon as something becomes designated, you know, a technology for the use of national security issues, then the framework doesn't hold anymore. Right?
And so it just allows for technology to be kind of pushed into the border space and in migration under the guise of

(45:00):
national security without the act and its kind of frameworks applying. And not only is this, you know, a European issue. Right? But the thing is, the AI Act could have been a real push to call for governance and regulation of border technologies globally. Right?
Because it is the first regional attempt to govern AI, and it

(45:21):
sets the precedent. And so if the AI Act is weak, there's just no incentive for the United States, for Canada, for Australia to regulate. This happened, you know, with the General Data Protection Regulation, the GDPR, in the EU, right, which set a really strong precedent. Not that it's perfect either, but it forced other jurisdictions to think about

(45:41):
data protection, in their own, you know, kind of behind closed doors, in their own jurisdictions. And, unfortunately, the AI Act just will not have that same power. And even though there are some conversations happening, you know, in Canada and even in the US. You know, the recently

(46:01):
administration came out with, was silent on border tech. It didn't mention borders once. Right? So, again, it's not a priority, and I think it's actually deliberate. Right? Because leaving borders as a testing ground is useful for the state, because it allows the state to experiment in ways that it just wouldn't be able to do, you know, when it comes to other

(46:22):
other spaces of public life, for example.

Jack Sanker (46:24):
Would be broadly unconstitutional and, violating
a lot, yeah, a lot of rules and laws. Yes. It's Well,

Petra Molnar (46:32):
and like you said,

Danessa Watkins (46:33):
not just state, but, these private companies,
like you mentioned in The Walls Have Eyes, I forget what company it was, but setting up in Israel, where, you know, they've made billions of dollars testing different technologies out on the Palestinian population, and then sell that technology to, you know, other states to use however they

(46:54):
they need to. Some of these, just to, because I know we say AI and it just sounds, you know, what is AI? But I just wanted to give a couple of examples that you brought up, which I think we as lay people can understand the issues with. One thing you addressed was the voice printing of cross-matching

(47:18):
somebody's voice with a database of accents. And you raised the issue of, you know, that dialects can change under different circumstances.
And that one just hit me because I just came back from my native state of New Hampshire. And when we were there, my husband was like, you're suddenly dropping your r's, and you're speaking in

(47:38):
a totally different way when you're with your family. But it's true. The circumstances, the subject matter, you know. Have you seen some of that play out, I guess?

Petra Molnar (47:49):
Absolutely. And, you know, thanks for bringing that up, because that is such a visceral example that I think a lot of people can identify with. For me too, you know, English is not my first language, and I find that, you know, when I'm tired, for example, my accent comes up more. Or if I'm in different spaces, we code-switch, right, as well. And so our accents change based on who we speak with.
You know? And it's just such a shortsighted way of

(48:11):
trying to understand just human beings. Right? I mean, this voice printing program was used by Germany in its asylum system, for example, and it would have somebody speak, you know, into this program, and then it would be assessed, and the program would say, oh, well, this person is likely from Daraa province in the south of Syria because of their accent and not Aleppo, a big city. But they said that they're from Aleppo,

(48:32):
and therefore, they must be lying, and therefore we need to assess their credibility in a different way.
Right? But I think this is the problem with AI. It's like human beings struggle with the complexities of just human behavior. Right? It also makes me think of these other projects that I find really disturbing, these pilot projects that basically created a so-called AI lie detector to be

(48:56):
tested out at the border.
And this AI lie detector would use facial recognition or microexpression analysis to be able to make a determination about whether or not someone's telling the truth when they're talking to an avatar. But you know, again, you know, I used to practice refugee law and, right, if there are listeners who are listening in, this might be familiar. Like, I've had people

(49:18):
that I was representing, maybe act in a way that a judge was having trouble understanding, or they weren't making eye contact with a judge of the opposite gender because of religion, because of their experiences, maybe because they were nervous.
Right? Or what about the impact of trauma on memory and the fact that we don't tell stories in a linear way anyway, let alone

(49:38):
some of the most difficult things that we have to talk about?
Like, human decision makers struggle with this and make very problematic assumptions about credibility, plausibility, truthfulness. That is the starting point of any refugee assessment. Right? And so if human decision makers do that, what will a partially or fully automated system be able to do?

(49:58):
One that's predicated on a very biased and discriminatory world
that we live in already.
Like, to me, it's just so troubling that we are seeing these AI technologies, whether it's voice printing or AI lie detection or anything in between, kind of forced into the migration space without even, like, did they talk to a refugee lawyer or a refugee themselves, right, before

(50:19):
piloting some of this? Like, to me, this is really disturbing.

Jack Sanker (50:22):
And we can't even, you know, post hoc go and look at what those inputs are, because in a lot of those cases, as you talk about in the book, it's classified. It's secretive. And there's no way for anyone to look into it and decide whether this stuff even works. And, I mean, all of these are operating under the assumption that, when applied correctly, this algorithmic decision-making

(50:44):
process will be accurate, and I think even that is extremely presumptuous. I tend to be, and sometimes the show tends to be, a little bearish on AI and tech and things of that matter. And this assumption that this stuff, like, could even work is, I think, one that requires a lot of faith. And I

(51:11):
I don't know that it's, I mean, even setting aside, like, the biases that are going to be inherent, you know, coming from whoever is setting up these algorithms and these processes and policies. I just don't know that they're effective enough for anyone to rely on them, and on these, you know, important

(51:36):
decisions that are being made on the border in our immigration
process.

Petra Molnar (51:41):
Yeah. For sure. I mean, I think I think that has
to be the starting point. Right? Like, does this stuff even work? Some of it is snake oil. Right? Like, the AI lie detector was debunked as something that's just not even working. But that's the disturbing part. Like, that it's just kind of thought of, well, this is a good idea to try out at the border, again, because it's an opaque space, it's a discretionary
space.
There's not a lot of accountability or oversight or anything like that. And so I think it's, yeah, it's actually

(52:03):
about these bigger questions. Like, what kind of world are we building, and are we okay with these technologies in the first place when we know that a lot of them don't even work? Not to mention, it's also about the kind of direction of travel, so to speak. Right?
Because they're always aimed at marginalized communities or people who don't have access to the same level of rights or even legal representation sometimes. Like, we could be using AI to

(52:27):
audit immigration decision-making, right, or to root out racist border guards. Like, that's a choice. Right? But instead, AI is positioned towards people on the move or refugees or people who, you know, oftentimes, again, sometimes don't even know that AI and automated decision-making is being applied in their case.
This happened in Canada, where lawyers, after the fact, when they got their hands on some

(52:48):
evidence of refused, for example, visa applications, saw
the decisions, and they were like, hold on. A human did not
write this. Right? But if, again, English is your 2nd or
3rd or 4th language and you don't have a lawyer and you have
no idea, like, you're not gonna assume that AI is making the
decisions.
Right?

Jack Sanker (53:04):
Right.

Petra Molnar (53:04):
It's just the whole thing is really, really
troubling because so much of it happens behind closed doors,
and even lawyers sometimes don't know what's going on.

Jack Sanker (53:12):
That's such a good point. If it was so reliable and
so trustworthy, then you would
see it applied to audit judicial decisions. And people
that actually, like, have an opportunity to push back against
it, you would see it applied against them, but you don't. You
only see it applied against folks that have, you know, no
ability to challenge it.

Petra Molnar (53:33):
Yeah. That's right. And it's not just even in
the immigration space. Right? But we're seeing similar
incursions in criminal justice with predictive policing, right,
and sentencing algorithms, welfare algorithms, child
protection algorithms.
It's not an accident, right, that it happens kind of in
marginalized spaces rather than in spaces of power.

Jack Sanker (53:52):
Right. You're not seeing it against, like you're
not auditing, like, prosecutors or judges or things of that
sort. Yeah. Just the folks that can't defend themselves.

Petra Molnar (54:00):
Yeah. Yeah. And on one hand, maybe we should be
calling for that. Right? But then it also normalizes
tech in those spaces too, which maybe isn't a good thing either.

Danessa Watkins (54:07):
But those are the people that would probably,
you know, have platforms to speak out against it. I could
just imagine the, you know, the boardroom of the people that are
collecting the data on this. And I'm sure the assumption is, oh,
wait, you know, 60% of these people were found lying. That
means it's working. You know, like, that's how they're
reading the data.
Like, the goal is to keep people out. So if it's doing that, then

(54:29):
it's working. Yeah. It's scary.

Petra Molnar (54:32):
Well, that's exactly it. I think, you know,
what you just said is really key. It's about paying
attention to the priorities and whose priorities actually set
the stage. Right? And if the priority is exclusion and
keeping people out and finding a technical solution to a complex
societal problem, then there you go.
This is what kind of drives, again, this border industrial
complex. Like, it's not an accident that we're seeing

(54:55):
robodogs and AI lie detectors and drones, because they're seen
as a solution to a problem that has been created by powerful
actors who say that people on the move are the problem.

Danessa Watkins (55:06):
Right. Well, and I do love the way that you
wrap up your book, Petra. I just wanna quote one
part from one of your last chapters: people on the move
often have to prove that they are somehow worthy of our
respect. That line stuck with me. And then you end with this
story from Zaid Ibrahim. And he says, what was our crime that we

(55:32):
fled from Syria?
You know, it's like we kind of put refugees in this other
category, but this changed my whole mindset of, no,
that could be me. And, you know, what is my crime? Fleeing
somewhere where my family can get shot in the streets, or we
don't have food, or we don't have access to medicine? It's,
you know, we need to, and I think your book does a good job of

(55:53):
this, of humanizing. These are people, and, you know, what's
being used against them is not right, and it wouldn't be
allowed in, you know, our day to day lives.
So why are we allowing it in these spaces?

Petra Molnar (56:06):
Yeah. And, you know, that was really my
whole goal with this book, and indeed with the whole
corpus of my work: to humanize the situation and
to kind of try and, at least in some small way, fight against
this kind of dehumanization narrative that presents
people on the move as threats, as just someone out there

(56:27):
that, you know, people cannot relate to. But what you
just said, I think, is really key. I mean, this can happen to
anybody, and it has, in fact, historically. Right?
Conflicts happen and people have to flee. And again, people
don't choose to be refugees, and we would do anything to save our
children or our families, right? If you know that, you have to

(56:48):
make some desperate decisions and cross. But I think we
can't lose sight of that, that there are real human stories at
the center of all of this, and I think that really is
hopefully the takeaway of the book.

Jack Sanker (57:07):
You describe yourself as a reluctant lawyer,
but one way that we can tell you're an authentic lawyer is
that the PDF of the book that you sent is dot final, which is,
like, the classic, like, 10 drafts
later, you know, this is dot final, all caps.

Petra Molnar (57:26):
You know, I just wanna thank you both for, like,
really engaging with the book. Not everybody does that, and
I'm really grateful for that.

Jack Sanker (57:33):
Well, thanks for writing a good book. Not
everyone just

Petra Molnar (57:36):
had it. I thought it was pretty nice.

Jack Sanker (57:38):
Some people write really boring books and then we're stuck.

Petra Molnar (57:43):
Well, you know, I will say, I really, like, from
the get go, I knew this was not going to be an academic
book. Like, yeah, it's okay, it's undergirded by years of
research and analysis or whatever, but, like, I wanted it
to be something that everybody could pick up. And I have to say
that the biggest, I think, proudest moment of my career is
when we found out that it was available at Target.

Danessa Watkins (58:03):
You know? Like, to me,

Petra Molnar (58:04):
I'm like, this is great. Well

Danessa Watkins (58:08):
So for our listeners, where can they
find you on socials? Where can they find your book? Give some
information, because I hope that our listeners will pick it up.

Petra Molnar (58:20):
Sure. Yeah. I mean, if you're interested in
finding out more about the work or about the book in general,
I have a website, just my name, petramolnar.com. I'm also on
Twitter, and the book is available anywhere books are
sold, whether that is a small local bookshop, hopefully, or
Amazon, or Target. It's available.

Jack Sanker (58:39):
Thank you so much. I mean, first of all, thank you
for writing the book, irrespective of whether you're
gonna be on the show or not. It's a tremendous work that you
spent a ton of time and effort on, and put yourself in harm's way
quite a few times. It's an excellent read. Highly recommend
to anyone who's looking to learn about this topic, but also to be

(58:59):
entertained.
It's got a really fantastic narrative as well.
Thank you for spending the last, you know, hour and 10 minutes
talking with us about this. And, for those of you that wanna
learn more, check out Petra's website, Twitter account, buy
the book.

Danessa Watkins (59:17):
Yeah. We can't wait to see when you're
quoted, you know, at the international level as the
reason why they start looking at these policies more seriously
and changing things. So we appreciate your work and all
your time.

Petra Molnar (59:30):
Thank you so much for having me.

Danessa Watkins (59:32):
Alright. Well, that's our show. We wanna say a
huge thank you again to Petra Molnar for joining us. Such an
interesting topic and certainly one that I think everyone needs
to pay attention to, given that these technologies are now
turning mainstream. So hope you pick up the book and enjoy it as
much as we did.
Again, we will be back every 2 weeks. You can listen to

(59:55):
Litigation Nation wherever you get your podcasts, and we will
provide links to access The Walls Have Eyes as well. We'll
see you next time. Thanks.