
July 27, 2020 73 mins

Emily returns with tales of a haunted hotel she stayed at in Mobile, Alabama. Then it's the #babywitches who hexed the moon, followed by a listener email about the queer Sapphic aesthetics of cottagecore. This of course leads into some discussion of Taylor Swift's new album Folklore, and how in a way it's always fall right now. We read a listener email about the struggles of international students right now, and another about working in a plastic surgery clinic during the pandemic. Then the girls are joined by Meredith Whittaker - research professor and co-founder of the AI Now Institute at NYU. After working at Google for over thirteen years, Meredith helped lead the Google Walkout and efforts to end AI contracts with the Department of Defense. We talk to Meredith about racism in tech, the biased badness of algorithms, and the human beings behind most "artificial intelligence." And, of course, RoboCop. How can we organize to prevent tech companies from, say, building war machines for the government? Plus one hot but easy tip for avoiding facial recognition software at protests. All this on the all new Night Call!


Footnotes:

  1. Haunted Malaga Inn 
  2. Moon hex explained 
  3. Marianne Williamson moon hex tweet
  4. Why are people angry at witches on TikTok? 
  5. Paper Mag on cottagecore
  6. Correction: Sean Penn's COVID test is CORE, not Curative
  7. Meredith and the Google walkout 
  8. Facial recognition/surveillance in schools
  9. Meredith testifying before Congress
  10. The AI Now Institute 
  11. Meredith Whittaker on Twitter 
  12. Tech Workers Coalition 



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
It's nine eighteen pm at the Malaga Inn and you're
listening to Night Call. Hello, and welcome to night Call,
a call in show for artist Topy in Reality. My
name is Emily Oshida, and with me on the other

(00:23):
line are Molly Lambert and Tess Lynch. Also stay tuned,
because later in the show we have Meredith Whittaker, the
research professor and co-founder of the AI Now Institute
at NYU. Um, she was also involved in
the Google walkouts. She's awesome, so please stick around to
hear her come by later. Hi, guys. But first: I'm back.

(00:43):
I'm back from my trek across the country. Thank you
both for holding down the fort while I communed with
the highway. Uh, yeah, I
did a six-day trip across the country, and it
wouldn't have been complete if I didn't stay at a

(01:05):
ghost hotel my last night, uh, in Mobile, Alabama. So
I actually, I mean, I wish that I had a
more colorful story to share with you, because you guys
were like, call us from the hotel, like, you know,
tell us what it's like. It was actually
just, like, nice. It was, like, a nice kind
of very Southern-style old hotel that was surprisingly affordable,

(01:31):
which is why I went there. I was like, this,
there's got to be a scam. That's the
benefit of a ghost. I know, I
was like, what's the catch? Because it's so nice. It
was, like, one of the cleanest places I stayed, which
was a big concern over my trip. Um. But also,
you know, it's on all these lists
of haunted hotels in, um, in the South. Yeah,

(01:54):
it's called the Malaga Inn, it's in Mobile, Alabama. I,
you know, I booked it, and then I went... I
should have done this in the other order. I booked it,
and then I went looking for ghost stories about it. Um.
But there was one I sent you guys that I
just want to bring up again, because it actually did
kind of freak me out. Um. But this is on...

(02:16):
I wish I remembered what site this was on. But
somebody said: my sister and I stayed at the Malaga
Inn in summer. Uh, they had a great dinner and
spent their time out on the balcony enjoying the evening.
But in the middle of the night, I was awakened
by my sister screaming bloody murder because someone had crawled
into bed with her, and even worse was when she realized
there was no one there. And the next day they

(02:39):
asked the front desk about it, and apparently the room
they were in was the one with the activity, which
is apparently room one oh seven at this hotel. So
I stayed in room one oh three. I was two
doors down from there, but I was in a historic room.
There were parts of the hotel that were part of
the original structure, and then ones that came in later.
So I made sure to get one that was, like,

(02:59):
in the good and haunted zone. Uh, there's other
stuff that I read about it that said, like,
there's creaking and activity, apparently because it
was used as a hideout for Confederate soldiers, um, which
I was like, it was Mobile, Alabama, why are
they hiding out? But I guess I
don't know how that worked. As with all things

(03:21):
in the South, it's incredibly haunted; as with all things
in America, it was incredibly haunted. But, um, nothing
really weird happened while I was there. As I said,
it was pretty chill and just sort of nice to
be somewhere that was, like, not a chain hotel. It
was just, like, pleasant. But a weird thing did happen:
there was, like, a portrait of, like, a

(03:42):
Southern belle on the wall next to my bed, and
I was just, like, lying in bed looking at my
phone, and then I just, like, aimlessly sort of put
my phone up and tried to take a picture of it,
and my phone would not take a picture of it.
I tried and tried, and it did this weird glitchy
thing that my phone has never done before, where, like,
the image on it seemed to slow down. It wasn't, um,

(04:04):
it wasn't in sync with like how I was moving
my phone around. It was sort of like there was
a big delay and it was like in slow mo.
And then I just turned it and I pointed it
at another part of the room and it took a
picture just fine. Then I went back to take a
picture of the portrait and it couldn't take a picture
of the portrait, so, um, I have like a blurry,
dark photo that it tried to take but it would
not get a proper photo. So that's as weird as

(04:28):
it got. That's pretty weird. That's pretty weird. Yeah, but
I wanted a real encounter. I want,
I want to believe. But I guess next time, whenever
I make my trip back, I'll try to hit all
the haunts. Haunted portrait: good enough. Yeah, I'll take it. Um,
speaking of haunted things, and the cousin of haunted

(04:51):
things, witchy things: it's been a while since we've had
a Moon Minute. I think it's time for a
Moon Minute, because baby witches were trying to hex the moon. Yeah, baby
witches from WitchTok, which is the witch part of TikTok. Yeah,
everything comes from TikTok now. Apparently this is just gonna

(05:12):
pivot to be a hundred percent TikTok podcast, because they're
giving us lots of material recently. Yeah. I just...
I also, like, I was not online a lot, but
I just, I did see a lot of stuff. Every
time I would go on Twitter, people just being like
what's up with the moon? And I didn't know what

(05:32):
people were talking about most of the time. But then
and then I caught onto this thing that was going
on with the baby witches, where apparently it became
a thing on TikTok to hex the moon. Um, they
first tried to hex the fae, which are, like, uh,
kind of fairies, like the fair folk, Celtic-specific

(05:56):
fairy folk. Yes, yeah, and then they decided
to try the moon. Uh, and it's not clear to
me why, other than for the lulz, um, which...
"for the lulz" feels like a very chaotic spirit
to go into being a baby witch with. But I guess

(06:18):
that's the nature of being a baby witch. Um, but
people were getting really mad and there was a rumor
floating around Twitter, which I have no idea if it's
true or not, that a twelve-year-old girl
was killed after trying to hex the moon. I know,
this feels very creepypasta-ish, but I don't know, so

(06:38):
I have no... I mean, if that's true, it's horrible.
I also don't know if the moon is responsible, but
either way, bad, bad news. But, um, still, like,
this kind of urban legend sort of grew up around
these moon hexes that are going on. Well, the reasons
why you shouldn't hex the moon are pretty straightforward. I guess,

(07:02):
um, because, well, anytime you do evil magic,
it's going to come back to you, and, uh, come
back to you times three, they say. But hexing the
moon also angers the gods that rule the moon, who
are Apollo and Artemis. I think, um, Artemis rules the moon,
Apollo rules the sun, and Apollo is twins with Artemis.

(07:25):
That's right, so he gets mad if you do anything
to insult his sister. Yeah. But also, so now the baby
witches were saying they were going to curse the
sun, and if you mess with Apollo... apparently Apollo rules,
you know, the arts and creativity, but also healing and medicine.

(07:45):
Uh, so not a super smart thing to do at
this time. Yeah. Um, personally, I would not fuck with
the moon. Well, the moon is... I just don't. What
did the moon do to you? The moon is a
friend to all witches and other people. I'm just picturing
that part in The Craft where Nancy's, like, just overcome

(08:06):
with how much power she has, you know, and you're like,
whoa, I can do anything! And that's maybe how you
think you should hex the moon. I just, I also
don't... if they're baby witches, can they even? This is,
this is the sort of cycle I get into
thinking about this, like, how are they even
being successful at hexing the moon? Had they come into

(08:27):
their power yet? But yeah, like, I don't
know how they would even be capable of doing such
a thing. I just think, if there are enough of
them, the collective energy... But also, Marianne Williamson
weighed in on this. Oh yeah, yeah, which is funny. Yeah,
she said, uh, that's got to be some really drunk

(08:48):
or stoned #babywitches if they think that,
in the midst of a #secretpolice invasion of Portland,
the best they can do is hex the moon, all
caps, facepalm, OMG. Uh, which is true. I
don't know why... again, I don't get it. Like,
I remember when, like, all the Bushwick witches

(09:08):
were trying to hex Donald Trump at one point, like,
you know, maybe in year one or two
of the Trump presidency, and that I understood. I actually,
I feel that somehow if, like, enough people at
the same time are concentrating on somebody's head exploding, allegedly, uh,

(09:29):
that that could happen, maybe. Like, I do think
that maybe there's energy that... Are you talking about Scanners magic? Yes, yes, exactly.
That's the... that's my Scannermancy, that's my trade. Um.
But, uh, yeah. And so that felt, you know,

(09:51):
like there was a clear political motive there. But the
moon thing is beyond me. It's the chaos of the youth. Um,
there's something else going on in the sky, which, well,
it will no longer be going on in the sky
when you hear this podcast. But there has been a
comet that Emily has been trying to see. I've been... guys,

(10:12):
haven't you guys? I haven't seen it. I haven't
been able to see it. I haven't been anywhere dark enough.
But the planets are really bright right now. Yeah, I
saw Jupiter when I was driving. Uh, I listened to,
you know, that syndicated NPR show where they just
say what planets are out, and they said that Jupiter

(10:33):
was out, so, like, I made a point to
check out Jupiter, because you could. It's like a
very bright star. It's like a very bright, warm-looking
star when it's out. But it's always cool when
you can see it. When I see the planets and
the moon, the last thing I want to do is
hex them. Yeah, right? I wanna, I wanna get them, whatever.

(10:55):
We are pro-planet on this podcast. Um, speaking of TikTok,
let's take a quick break, and then we will be
back with more TikTok updates. Welcome back to Night Call.

(11:19):
We got a night email updating us on cottagecore,
which, it turns out, we spoke about without really knowing
anything about it, which is something we're trying to be
better about. But we love when you write in
and tell us things that we didn't know. So here's
the email we got: Hi, everyone. I was a little

(11:42):
surprised by how you talked about cottagecore, though I
had heard rumbles it was starting to be co-opted
by fascists. Cottagecore, as I've encountered it on TikTok (yes,
I'm another Gen Z listener), is primarily a queer, specifically
sapphic aesthetic. I think cottagecore's appeal is something
of an escapist fantasy: you and your partner
can escape a homophobic, transphobic, misogynistic, capitalistic society by isolating

(12:07):
yourselves in a cottage where you raise chickens and embroider together. Um,
cottagecore, though, can glorify history based on exploitation of
Black and Indigenous folks, which is something you discussed on
the episode, and I think is important to discuss in
the context of its use as a queer aesthetic, especially
since the prominent cottagecore influencers are predominantly white. Thanks

(12:28):
for all the work you do. Thanks so much, listener Katie. Yeah,
I still don't really have a grasp on
cottagecore, so this really helped out a lot.
Uh, and she also linked to an article
in Paper just explaining cottagecore. I don't

(12:49):
remember if I said it on the pod or not,
but, like, I kind of hypothesized that a lot
of Miyazaki movies are cottagecore, just because of
this sort of almost divorced-from-history fetishization of,
like, a kind of Bavarian slash Italian aesthetic, like a

(13:10):
very old-world Euro aesthetic, um, like, reappropriating it
for his own kind of mythology and everything. I think
there's always tons of that going on in all contexts.
I think people were kind of commenting on the
fact that it does tend to be a pretty white aesthetic,
which, from the little that I had

(13:31):
seen, I had also witnessed. But it is Walden,
the original cottagecore. It's Henry David Thoreau, the first
cottagecore influencer. Yeah, because that's the thing: it's
like the same thing as country music. It's like, you know,
the myth of Walden, and then the myth

(13:52):
of actually going out into the wilderness, is kind
of, like, inescapable from these things. Like, country music was
never really about anybody living in the country. It was
all about, like, an imagined nostalgia for a
country life, for, like, a rural lifestyle or something. Um, so

(14:12):
I think that that's just, like, almost inescapable: anytime
you try to, like, codify a vibe, there's going
to be some mythologizing there. Um, yeah. One thing I
didn't mention last time, but there was, like,
a Japanese subculture called mori girls, and I was like,
is that what this is? They're forest girls. Totally, yeah.

(14:36):
That was, like, sort of the anti-gothic-Lolita: like,
very minimal, like, earth tones and loose clothing. Yeah, it's
a very familiar Internet aesthetic. It's, you know, in
the family of the autumn lovers. Um, and

(15:00):
it came up a lot last night because of the
new Taylor Swift album coming out. I was gonna say,
is this not cottagecore? Is she not going right
into the cottagecore? I'm just gonna say, tread carefully, Molly.
Every time you mention Taylor Swift, you enrage her fans,
and I don't want them coming for us. I,
I love Taylor, I don't... It's not me. I welcome dis-

(15:20):
course with the Swifties. People are specifically, like... um,
the band MUNA, who I really love, um, did a
tweet where they were like, the Taylor Swift album is
for, like, sapphic cottagecore influencers. Um, because a lot of
people are reading some kind of sapphic vibes into it,
and, um, people are like... because she wrote

(15:41):
one song from the perspective of a male
voice where it's, um, you know, talking about girls. But
people were sort of like, what's this all about? But
I think also, it's like, whether that is Taylor's intent
or not, um, it's definitely, like, resonating with a lot
of people that are into the queer cottagecore aesthetic. Well,

(16:03):
cottagecore also just is very timely, obviously. Yeah, I
mean, it's almost practical at this
point: being trapped in your home but also so
tethered to technology makes you consider that the only other
option of being trapped at home that is more appealing
than being tethered to technology is being trapped at home
and kind of homesteading, or, like, becoming involved in, like,

(16:26):
plant matter and, like, digging in the mulch, but also
looking kind of cute while you do it. Why not? Now,
That's what I think is so funny when people like,
you know, I've seen friends who have like gone and split,
you know, a cabin somewhere for a weekend. Uh, And
then there's always, like, a kind of sub-tweety energy

(16:49):
of, like, oh, you went to a cabin for the weekend.
It's like, that's literally the only place you can go
right now. Like, if you want to go anywhere that's
not your home, you kind of have to go cottagecore
a little bit. I also just think it's funny
because it is such a sort of like it's a
fantasy of no technology in a lot of ways. But

(17:09):
then it's on TikTok. Yeah. Well, the other thing is dark academia.
That's there. I also was thinking, like, the Taylor
Swift aesthetic, I was like, is it light academia? Is
that her aesthetic, non-dark academia? She had a
tweed coat in one of those photos, so that
kind of edges into dark academia territory. People

(17:31):
were saying, they were like, oh, this album is so autumnal,
but everybody's trapped at home, so it doesn't really matter
what season it is. You can be cozy, cozy cottagecore
in the summer, because what are you going to do?
I think of cottagecore as a much more
summery vibe. Yeah, because cottagecore is the, like, linen

(17:54):
dresses with the... it's breathable, and dark academia
is much more layer-friendly. Cottagecore's, like, pollen
in the air and you're wearing a dress and you're,
like, digging up roots or whatever. Yeah. Um, one of
us is going to get on TikTok one of these
days and actually figure out this shit for ourselves. I

(18:15):
would be really curious as to what subculture... Can we
just create a Night Call TikTok and share it somehow?
I'm afraid to commit. We also got another email about
our COVID reality right now from a listener, who writes: Hello, Night Call.

(18:36):
This email is two episodes too late. I'm just catching
up on the episodes, but I thought I would send
it anyway. I really appreciated when Emily talked about international
students in relation to the on-campus experience. As someone
who works at a university and as an international student,
I've seen the effect COVID-19 has had on international students.
When universities kicked students out of dorms, international students and

(18:58):
other students without a network were placed in precarious situations
because they have no family to turn to for support. Moreover,
many international students where I live are essential workers, and
new visa laws and immigration changes made their situation even worse.
For me personally, I was accepted into my dream graduate
program at NYU and was super excited to
move to New York City and to have new exciting experiences. However,

(19:22):
due to my family's experiences with US immigration and the
harassment we face, I ultimately had to reject the offer.
My experience is nothing compared to thousands of students who
might have to leave the country due to ICE's online
learning stipulation. Thank you so much. That letter was from Hussein. Uh,
it's great to hear from somebody who had actual experience
with this. Yeah. I mean, what is...

(19:44):
I vaguely keep catching up on what's been going on
with, um, with school, the entire notion of school. Um,
I know that, uh, several universities sued the Trump administration
over this policy, uh, where ICE was, you know, going
to potentially deport students who were on a visa. So,

(20:06):
I mean, I don't know if that goes for
everybody or just for the people who are at those
schools that are filing lawsuits. Um, it just
seems... so, again, this, like, piecemeal approach to everything right
now is just like so disorienting and so confusing and stressful,
especially if you are somebody in this situation where you

(20:28):
don't know if you are able to stay in the
country but you still have to like finish school. You know.
There are also so many, so many reversals every time
a decision like this is made, and I think particularly
when it comes to foreign students,
who are making these huge moves, it's not a
great position to be in if you know that any

(20:49):
decision that is made could be easily reversed at any point.
I mean, making the commitment to make that kind
of a move, um, under such precarious circumstances, is
an impossible task. Yeah, it seems like a
lot of this stuff is, the Trump administration will, like,
declare something, and then people have to spend a lot
of time being like, wait, is that even legal? Um, yeah,

(21:11):
and by the time you figure it out, it's just...
it's, um, it's a terrible, chaotic approach to everything, and
the confusion, I think, is intentional. Uh, definitely so. Uh, man,
thank you so much for writing in, Hussein. That's a
great perspective for us to have. We also have another
call. And by the way, if you would like to
talk about your experiences as an essential worker, or otherwise

(21:34):
working during the pandemic, please give us a night call
at 1-240-46-NIGHT. You can
also email us at nightcallpodcast@gmail.com.
We are more than happy to keep you anonymous. Um,
just let us know at the top of your letter.
Or your experiences as a student, or just any
stories about COVID times and trying to get stuff done

(21:55):
under impossible circumstances. We would love to hear from you,
and we're happy to keep you anonymous. So this comes
from one of those anonymous writers. They write: I work
in a clinic for a surgeon in Orange County. He
never stopped seeing patients. I got COVID the first week
of March from a patient. I was out of work
for six weeks and I'm still having breathing problems four
months later. I had a second exposure three weeks ago

(22:16):
with another patient who was positive. I ran a fever
for three days and had to have two tests to
make sure I was negative before I could return to work.
We have screening questions that we're supposed to ask patients,
including if patients have traveled and if they have a fever.
But even if they have traveled or if they have
a fever, we haven't turned anyone away. I'm afraid I'm
going to get sick again, and I'm concerned that I

(22:37):
will bring it home to my family. Thank you. I
just wanted to share. This is so psycho. As somebody
who has been in doctor's offices more times than I
would have liked to be in the last,
oh my god, almost like five months now: uh, I...
The first thing I had to do was get a

(22:57):
tooth pulled, and there, right away, you know, over
the phone when I made the appointment, there was that
checklist: like, have you been out of the country? Do
you have any of these symptoms? Um, and, you know,
this being very early in lockdown, the day
of, I kind of had a sore throat, but it
was because of my tooth, that was giving

(23:21):
me problems. Like, that's a symptom of a
kind of irritated wisdom tooth, you can
have a sore throat sometimes, and that just
started up that morning. So I went there to the
office and I, like, was honest about that. I was like,
I think it's just the tooth, though, and they almost
didn't let me in. Like, they were taking it super,
super seriously there at this, um, at this oral surgeon,

(23:42):
which I appreciated a lot. They still saw me, and,
like, the sore throat went away as soon as the
tooth was out, so that was totally what it was.
But, um, the fact that they're asking questions and then
letting people in anyway, it's like, what is
the point? It's like people are not thinking about
what the actual point of any of these safety measures
is now. Um, and that's so upsetting, especially if you

(24:07):
don't really have an option not to distance from people
when you're in a job like this, right? And it's also,
like, you're taking people at their word, you know. I mean, yeah,
this is the lack of testing as well, um,
because, like Emily said, so many people are having symptoms
of anxiety or neglected, you know, health problems that

(24:28):
you have to deal with later. Like, I had a migraine, um,
a couple of weeks ago, and because I was
having a migraine, my heart started racing, and I have,
like, the pulse oximeter, and then of course you look it up,
you know, you're just feeling like you have a migraine,
you're feeling under the weather. And then I was like, oh, okay,
well, all of this could also be related to COVID.
But then it was really hard to get tested, and

(24:49):
I was very lucky to be able to. But I mean,
when you interview people about problems that they're facing, and
for which they are seeing a doctor, there are, of
course things like a fever that you can measure, but
then there are also just general how are you feeling questions,
And if you're seeing a doctor, chances are you're not
feeling great. So you have to be able to know
what's going on on an individual level before you see

(25:11):
a patient. Otherwise you're putting everyone at risk, and that
can really only be accomplished with a test. Yeah, I mean,
I really feel for people that are working in doctor's
offices and hospitals right now, because you need a hospital
to be that safe zone. But yeah, there's so many
factors that we don't have under control because we don't

(25:34):
have testing, and there's so much protocol that we don't
have yet either. Like, you know, I am way
overdue for just a dentist checkup at this point, and
I've been putting it off because I was like, it's
not essential. Like, I had to get my tooth pulled,
but I haven't had a dentist checkup in a while,
and now it's just like, well, when can I
do that? It's like this, it's the same thing as

(25:55):
like a regular checkup or getting a physical or something
like that. It's like, I think everybody
was under this assumption that it was only going to
be for three months. It's like, okay, well I can
put off a physical for three months, who cares. But
now it's like, okay, if this is going to be
going on for a year, a year and a half,
like, what is the best practice in that case?
Just to not take care of yourself? Like, and

(26:17):
there's been no guidance around that from
anybody at all. Especially here, where there are things like,
for instance, I was talking with someone um and they
weren't able to get their contact lens prescription because it
was like a month, you know, past its expiration date.
In Canada, my friend said, if you're slightly
past your prescription date and there are circumstances like this,
they will just send you your contact lenses. But because

(26:40):
they're classified as something that, almost like, is ingestible, they
go, like, inside your body, they are regulated, so
she has to go out now to the eye doctor.
And there's things like that where it's like, you know,
for certain types of medications that require a doctor's visit,
the patient might want to not take the chance on
exposing their doctor, but then they're not going to get
the medication that they need. So there are all

(27:01):
of these complicating factors that might make patients go see
the doctor even if they weren't able to get a test,
even if they didn't feel great. And that's such a
terrifying predicament because there's really no right way to handle it.
And I think we're also seeing now, like, the three-
months thing, I feel like we always kind of thought
was bullshit. It was just, like, to buy people enough
time, because they can't think beyond, like, three months. It's

(27:23):
something everyone can handle. If they had said it was
going to be a year or two, people would have
flipped out more. I definitely felt strongly that it was
going to be a year, uh, just from the get-
go. Really? Yeah. I did not. Because I was like,
they're trying to be comforting, but, like, nothing they're saying
has any concrete plan, you know. But I didn't think...

(27:46):
even as somebody who thought it was
going to be a year, I was still like, five months
from now, we'll at least have testing or something, you know,
we'll have, like, regulated testing. Um, and the fact that
we are five months into this and it's still so
ticky-tack and doesn't seem like it's going to get better...

(28:06):
And if you get a test, there's still enough of
a delay in getting your results that it's
so much less useful than the rapid-results tests, which I
think are still, like, a hundred and twenty-five dollars
here, um, that you can get at urgent care. And
are those accurate? Like, I think that those are
more accurate. I've heard of people who were... Yeah, the

(28:27):
nose is more accurate, the nose swab is
more accurate. The cheek swabs, though, there have been instances
of people, um, getting positive cheek-swab test results when
they never actually swabbed their cheeks. They, like, got in line,
had an appointment, turned around because the wait was
too long or whatever, and then they get a call or
an email that they tested positive, and they were like, no,

(28:49):
I didn't. Yeah. The LA swab testing is being
run through a nonprofit, uh, created by Sean Penn. Is
that Curative or whatever? Yeah. So again, it just
seems like the privatized nature of all these things is
part of the problem, um, that we don't just have
nationalized healthcare where everybody could get tested. That would be great.

(29:14):
That would be nice. Well, we're going to take a
break, and when we come back, we're going to be
joined by our guest this week, Meredith Whittaker, to talk
about, uh, surveillance, uh, labor rights, so-called artificial intelligence, uh,
and so much more. So stay tuned for that. Welcome

(29:44):
back to Night Call. We are joined today by our very
special guest, Meredith Whittaker. Meredith is a research professor and
the co-founder of the AI Now Institute at NYU.
Her work focuses on the social implications of artificial intelligence
and the tech industry responsible for it. Previously, she
worked at Google for thirteen years, where she founded the

(30:04):
company's Open Research group and helped lead labor organizing efforts,
including the Google walkout and efforts to end AI
contracts with the U.S. Department of Defense. Her work
is driven by the belief that worker power, collective action,
and strong social movements are necessary to ensure meaningful tech accountability.
Welcome, Meredith. Thanks so much for being with us this week. Yeah,

(30:32):
I'm happy to be here. Um, yeah. So, I
feel like we constantly are saying among ourselves that
we wish that, like, we had kind of
an infosec expert on call at all times. So
you are going to be our infosec expert for
the next thirty minutes. Um, so just strap in. Um,

(30:56):
I think that, like, maybe a good way to kick
it off is just talking about the specific
concern recently about, um, about surveillance, specifically at protests and
among activists, um, particularly in Portland. There have been concerns, that
I know you've been talking about on your Twitter, about

(31:19):
just sort of the kind of murky waters of
capturing face data and other kinds of data from
these events. Um, it's like a weird new world
that we're all navigating, and unfortunately there are, like, uh, proto-
fascist slash fascist forces that are taking advantage of that

(31:39):
uh, not-quite-figured-out world. So, um, yeah, is
there anything particularly from that that has been surprising
to you recently? Yeah, I mean, let's, yeah, let's
open up the whole can of worms. Yeah, I don't... I mean,
I think surprise isn't the word, right? I think we
know that the tools of centralized power, social control, surveillance exist,

(32:06):
and frankly have existed for some time. I think what
we're beginning to recognize is what they look like in
the hands of, you know... I think proto-fascist is
a bit of a diplomatic term for what we're seeing
with the, you know, the federal troops, you know,
DHS's involvement, um, you know. But, you know,

(32:27):
I don't think I'm surprised. I think this
is extremely troubling, right? And a lot of,
you know, my sense is, with a lot of people
who are sort of historicizing this moment, looking back at, like, well,
this is, you know, this is how fascism looks when,
you know, suddenly there is, you know, kind
of a domestic militia that is serving, you know,
the goals of the, you know, authoritarian party on

(32:51):
the streets. Um, you know, what we haven't seen before
is fascism with the types of surveillance affordances that
these folks have, right? With the ability to, you know,
buy location data and other intimate data from, you know,
data brokers, with the ties that companies like Palantir
and others have directly to the Trump administration, with the

(33:14):
sort of, you know, the way in which we have
now structured our social and political lives
around the necessity of having these networked infrastructures that are
necessarily collecting, you know, data about us,
making inferences about us. Um, you know, whether or not
we call that surveillance, whether or not we call having,
you know, a private company own all of our Gmail

(33:36):
going back for years and years surveillance. You know, that's
the bedrock on which all of this networked technology and
the industry that has been built around it is resting.
And we're now seeing, you know, what happens when the,
you know, happy primary-colored promises of tech, you know,
actually aren't kept, right? Um, so here we are. Um,
but, you know, again, this isn't new. We saw this

(33:56):
in Ferguson, you know, Standing Rock; a number of these,
you know, dynamics were at play. We felt, with the
pandemic powers that were given to ICE by Trump, that
they now... sort of, you know, this is permitted under
those powers, for these, um, you know, for federal
agents to be deployed in this way, for them to
use, you know, drones and other surveillance capabilities to, you know, track,

(34:19):
and, what we're seeing, sort of, you know, harass, harm. Yeah,
I mean, it's, like, ironic, all the ways
that the uprisings right now are taking place
at the same time as the pandemic.
It's like, at the same time that other
countries have enacted contact tracing and that sort of thing,
and that's something that we're talking about here, it's like,

(34:42):
we also see, we're immediately living through the downsides
of something like contact tracing, um, currently, as opposed to,
like, a hypothetical that could happen sometime. Yeah, I mean,
contact tracing is a great example of just, like, how
wild tech hubris is, right? Because we are, you know,

(35:04):
the foundation... we are in a failed, failing, present-progressive
state right now, right? Like, we don't have testing that
is clearly deployed. We certainly don't have a social safety
net that would enable people to make a sort of
choice about their protection, right? Like, oh, maybe, you know,
I have received a ping that I was in contact
with someone who tested positive, which presumes the presence of tests, right? Um,

(35:26):
and now I need to stay home for two weeks, right?
You know, most people don't even have two weeks' vacation
time to take, let alone, you know, the, you know,
you know, the resources to, you know, make that choice
for, you know, community and self-care. So I think,
what we're looking at, you know, the idea that you
could just slap contact tracing on that, when you don't
have ground-truth data around testing, when people aren't in

(35:49):
a position to, you know, make a choice for public health,
when we don't have those politics of care, right,
you know, anywhere in our infrastructures... You know, that's the
kind of, like, we'll-solve-it-with-tech, while ignoring
those sort of bedrock conditions that, you know,
are necessary for us to even
have a conversation about whether

(36:11):
sort of tech-based contact tracing could work.
And there's a bunch of problems with tech-based contact tracing,
especially Bluetooth-based. It, like, doesn't know when it's going
through walls, so you can get a contact trace that's,
like, if you're next to somebody in a car, it
will consider you next to them, right? Like, there's all
sorts of, like, you know, real-world limitations to the
hype that is being, sort of, you know, floated around

(36:32):
these approaches. And, you know, again, you need
test data to be using, test data, to
do contact tracing, and, you know, frankly, contact
tracing is best done by, like, an army of human
beings who are doing the hard work of, you
know, doing the work
of public health experts. Yeah. Yeah, Meredith, one thing I

(36:56):
really love about your work: you use the term "tech
hubris." Um, I think you're really good at calling
out tech hubris, and especially at sort of talking about
promises that tech companies make that they can't actually deliver
on and tech that doesn't exist that's being sold. Um.

(37:16):
You had a good tweet the other day about, uh,
software that was being sold, I think, to the Pentagon,
that was supposed to recognize human emotions. And you said,
let me save you a billion dollars and tell you
that this doesn't work. Surprise! Oh, it's... so, y'all, it
is so... I mean, we've had

(37:39):
twenty years, I think, and more. I'm not, you know,
I'm not a trained historian in this, but I, like,
came up in tech, right? Like, we have just sort
of trusted this narrative that, you know, tech company revenue
gains are, you know, synonymous with progress, right? With
scientific progress, with social progress, with, sort of, you know,

(38:00):
the betterment of the world. And I think we all
know that's more or less bullshit, but I think it
has combined with this kind of, you know, the way
in which tech supremacy has become part of our
just general narrative, right? And people are afraid to question
these technologies, right? You see, you know, there's this
constant kind of derisive commentary on Twitter, like, if only

(38:23):
senators were smart enough to question Zuckerberg, right? But, like,
we are in a context where, for some reason, we
made it okay for people with a sort of elite
technical education, who had access and expertise and certain information
most people don't, to sort of, you know, assume
the right to make significant policy decisions, significant social decisions,

(38:45):
significant decisions on behalf of, you know, millions and billions
of others... you know, that that right rested
only with people with this training, and that no one, you know,
if you didn't have a CS degree from Stanford, you
can't actually be talking about these things. So I think,
you know, under that sort of smokescreen, you know,

(39:07):
so much of this stuff is a scam, right? That
emotion-recognition stuff, like, that is being piped into every
market you could imagine. Right? There's job-interview software that,
based on your tone of voice, based on your micro-expressions,
based on, you know, how you sound, what your
mannerisms are, will determine if you're going to be a
good worker or not. Right? Like, we're looking at something,

(39:28):
there's no scientific consensus around this, that you can automate
that type of detection, and you're looking at sort of
the resurgence of principles of physiognomy and race science, and
this idea that someone's worth can be determined by their,
sort of, you know, external physical characteristics. Which, you know,
hasn't there been mention of that, like, them using that

(39:50):
sort of thing for policing too, or am I misremembering?
Because I feel like it's, you know, one of the
issues where, you know, people are like, give us
specific evidence of this. And one of the issues is
that a lot of the technologies that are being used
are produced by private firms and then procured by
governments or private businesses, and that whole process is,
you know, if we're talking about, like, policing agencies,

(40:12):
oftentimes it's classified or hidden based on national security.
If we're talking about private firms, it's hidden behind trade secrecy.
So we know some examples, but I think it's
also important to recognize that the obscurity
is structural, right? It's intentional, and there's a lot we
don't know. But we do know that, you know, police
departments are being sold facial recognition systems that claim to

(40:33):
sort of detect aggression in people's faces, and obviously, when
researchers have looked at these, they are, like, extraordinarily racist, right?
Or looked at similar systems, right, because getting access to
some of these is also hard, because of what I
just mentioned. But yeah, policing is using it. The military...
the military is certainly using it. Uh, schools are using it. Uh,

(40:54):
how... wait, in what contexts are schools using it?
Are our students attentive, right? Are they paying attention?
It is a bad racket that is funneling, um, money
for a sort of security that was kind
of pumped into school districts, you know, because of
the school shooting problem, and that money is now being

(41:16):
used to buy, sort of, security cameras, you know,
that, yeah, add police to schools, um,
but a lot of those, you know, like, detect suspicious activity. Again,
this is sort of the inference that what your body
looks like and what it does can somehow translate into
a magical assessment of, uh, you know, of criminality. That
is so terrifying. I wanted to ask you also, just

(41:39):
like, what... especially when you think about how that could
also be applied now that students are going to be
doing remote learning and they're all going to be on
Zoom. I mean, it's the idea that that could
kind of evolve in even more sinister ways. It is.
There's online proctoring software that now, sort of, you know,
claims to detect attentiveness. But we're talking about... yeah, yeah.

(42:03):
I also wanted to ask, so you've been in the
tech world for a long time, was there a kind of evolution
for you? At what point did you become kind of,
like, more concerned and more aware of these issues? And
when you entered the tech world, like, how aware were
you of the kind of, like, ways that this could
be manipulated in racist ways, or, really, you know, to, like,

(42:27):
eliminate, you know, the personal privacy of users? Like, when
did you kind of become aware of that, in
this sense? Mm hmm. I mean, there wasn't one
crisp revelation. But, you know, I started at Google in
two thousand six, right out
of undergrad, and I took the job at Google because

(42:49):
they offered it to me and I was broke, right?
So, like, for me, it was very simple, right?
Like, rich kids went to grad school, and I
was not one of them, and I needed a job,
because there was a certain amount of money in my
bank account and it needed to be more money.
So, like, um, you know, I also, I mean,
I don't know, I come from a class where,
like, the point of a job was to do the

(43:11):
least amount of work and get the most amount of money.
And there wasn't this sort of identity, like, your job
was not who you were or your passion, necessarily, so
I kind of entered it like that, right? And then,
and then, I was just, you know... and I had
kind of been involved in sort
of radical politics, radical spaces as a teen, and that
was sort of where I was at home. And so

(43:32):
there was a whole wild time of kind of
entering Google and being like, what is this? Right? Like,
I don't have technical training. I had to figure out
a lot of these things by, like, asking people, um,
and I think part of that process was, like, a
demystification journey for me. Like, why do you just call
it that? That's actually not... that's just a
confusing word, okay. And recognizing that it wasn't magic,

(43:55):
that a lot of the ground truth was really shaky,
that a lot of this, sort of, you know, a
lot of this was sort of hype. But that took
me a while, I think, you know, just, like,
asking these questions and sort of
arriving at this time where there were
little pockets of kind of academic tech critique, but there

(44:15):
was not the sort of wholesale kind of skepticism that
I think is, you know, now part of a, you know,
healthy conversation about tech, right? And so, just asking,
like, why do these guys, you know, think they're ethical billionaires?
Like, what is ad tech, right? Like, what does it
mean to sort of have this data? What are the
excuses they use? Um, so that was, I mean, that
was thirteen years. So like over that time, I was

(44:36):
like, interested in social issues, right? Like, that was where
my work gravitated. I worked on privacy, I worked on
net neutrality, I worked on a lot of these other things,
and I got interested in AI, because AI is sort
of, like... I don't know, I have a whole
spiel I'll save for some other time around it. It's
kind of a scam. It's like this big term
under which, like, data-based technologies, right, technologies that

(45:00):
make inferences about data, are, you know, kind of marketed
right now. And there's a number of reasons that sort of
emerged in the last ten years, around, sort of,
you know, tech company consolidation of power,
the consolidation of data, the consolidation of powerful infrastructures, whatever.
But you know, the mythology around artificial intelligence that it

(45:22):
was sort of infallible, that it was actually, you know, more
capable than humans, that it was, you know,
dual-purpose, it could solve any number of problems
from healthcare to schools to, you know, policing. Um, I
think, you know, I was really concerned about that,
because I was, like, dealing with issues of, like, how

(45:42):
clunky and fallible data was, right? Like, how, you know,
even when there was a lot of intention behind trying
to, like, get data right, it was always, like, kludgy.
It was always just an artifact of, like, institutional processes
and compromises and perspectives, right? There was no,
you know, end-run to get sort of that Cartesian
window of objectivity, right? But people were building these

(46:03):
systems on, you know, crappy data, with crappy assumptions, and
selling them as sort of magic into some of the
most sensitive social domains you could imagine. And it was,
you know, it was consolidating power. It was threading these...
you know, it's like, a handful, five companies have the
resources to make this kind of technology at scale right now.
So it's sort of threading all of our social and
economic infrastructures through these five companies, whose core function is,

(46:26):
always, forever, and nothing else, but to grow revenues increasingly,
infinitely over time, right? Like, I think that
when people think about artificial intelligence and machine learning and,
you know, kind of using machines, or using,
uh, any kind of tech, to replace a workforce, you know,

(46:47):
we think about, like, a labor workforce or industrial jobs
and that kind of thing. But I think
we're seeing more and more the core philosophy of
that, and the actual failings of it, the failings
of, um, taking for granted that it's better
to not have a human do a job, or that
it's somehow more advantageous. Because when you're talking about stuff

(47:09):
like teaching, policing, all these things that we've been
talking about, like, those are cases where
it's good to have a human there, it's
good to build human skills. Like, those sort of intangible
skills are priceless. And that's just not a part
of the tech philosophy, like, the big-tech philosophy, right now.

(47:30):
Um, so, that's what I've seen. Yeah, it also feels to
me like the people who are in charge of tech
are part of the reason that this is all being
handled so badly, because, you know, in my experience, all
the people who have been, like, the canary in the
coal mine have been people like you, and other, like,

(47:51):
women, and especially women of color, working at tech companies,
who noticed that, like, the algorithm is racist and that
we shouldn't be selling it as, like, this objective thing, um.
But there's a lot of money, obviously, behind people sort
of being like, nope, it's perfect algorithms. You know, the

(48:12):
math is infallible, and this is math. Yeah, they love it, right?
And I think, I mean, I think one thing around
that as well is, you're having this sort of confluence
of, like, that type of tech hubris and these, sort
of, you know, kind of hollow marketing claims on
the tech company side, and then you're having, you know,
decades of neoliberal austerity on the other side, where the

(48:35):
offer of replacing a hundred humans with a machine that
you only have to license is really tempting, right? Like,
and I think there's a sort of, you know,
feedback loop there that's extremely dangerous. Like, in Michigan,
they installed an algorithmic system
that was replacing unemployment adjudication. So, they used

(48:56):
to have a staff of people saying, like, oh, this,
you know, unemployment benefits claim, uh, looks like fraud, right?
And we're gonna chase that down. They instead tasked a
machine to do that and fired all those people. Forty thousand
people were wrongly accused, right? And then that creates a
record that means you can't get benefits, you can't get
another job, right? Like, there were bankruptcies. There were suicides.

(49:17):
Like... and that's, like, one example of the
way this type of logic is operationalized. Like, in Michigan, we
need to also name, like, Rick
Snyder and, you know, the person who presided over
Flint. Like, those were the logics that are sort of
driving this. And then, of course, there's a company on
the other end, like, willing to sell some,
like, you know, janky bullshit that hasn't been audited, because

(49:39):
no one involved in that sale and that procurement and
that marketing are going to be the people who are,
like, harmed by it, right? And the people who are subject
to it, you know, have no freaking idea that there's
an algorithm on the other end, because they were never
part of that process. There was no democratic decision-making.
Well, that's... also, you had testified before Congress, um,
at some point to talk about the need for over

(50:00):
sight and transparency in AI. I think that ties
into this, because a victim
of that kind of thing is never necessarily
aware of the fact that this decision is being made
by a bot, basically, um. And so it becomes, I
think, harder to figure out how to address those things
if they happen wrongfully, because there's no kind

(50:22):
of chain to follow up. But I wanted to talk about,
you know, when you kind of bring attention to
these things in your activism, you know, what kind
of challenges are faced by people who come forward about
these? Like, I know that, you know, after the Google
walkouts and everything, that was pretty contentious, and, you know,
I feel like generally it seems as though people who

(50:44):
bring attention to these issues in big tech are really
kind of facing an uphill battle. So, I mean, how
do we kind of advocate for people who are whistleblowing
or who are advocating for changes to be made? Yeah,
I mean, I think, like, it's no surprise, right?
You know, they were fine with me when it was
just me as a critic saying things, right? And even

(51:05):
if I called them out, even if I dragged them, like,
you know, I might get some, like, "come
on" there, but, like, that actually served a purpose, right?
It made it look like Google was a place that
accepted all voices and, you know, robust debate, even
if I had no power to actually make a decision
based on my research and my insights. Um. It was
when I started organizing, you know, with my colleagues,

(51:27):
when you actually start building power, which requires solidarity in
power-asymmetric situations, um, that they were upset, right? And
it was classic. There was no, you know,
there was no tech wizardry involved, right? Like, they got
some old corny union-busting firm in, to be
like, you hurt the people leading, so the people don't organize, right?
Like, it's, you know, it's pretty simple. And I think,

(51:48):
you know, recognizing that fundamentally this is about power. Fundamentally,
there's nothing particularly smart or different about tech, although the
way it, you know, masks and centralizes power some, and
we need to recognize that when we're sort of in
these struggles and doing this research, um. But, you know,
it's going to require solidarity. It's going to require sort
of new forms of solidarity, in my view, that are

(52:10):
kind of across, you know, what people have called supply
chains, or, you know, seemingly disparate sites. So if we're
just looking at workers that receive a full-time employment
check from Google, we are not looking at a coalition
that has the power, across all of the domains where
Google is threaded, to actually check them, right? So we
need to be thinking about, you know, what does it
mean to be a worker, What does it mean to
be in solidarity as a worker, and what does it

(52:31):
mean to be in solidarity with the communities, like you said,
who are subject to these technologies that we as workers
may not be subject to, because of certain privileges, right?
So, like, how do you, you know, how do you allow those
movements to actually lead, if you're talking about sort of
worker solidarity actions? And, I think, you know, there are
a million ways to support. It's just, you know, support

(52:52):
like, from where you stand. Yeah, I'm so curious, because
you said you started at Google in two
thousand six, right? And you've been working in tech since then.
And I feel like, in that time period,
especially with you leading the walkouts... It's like, the

(53:14):
attitude and the trust that the average
person has in tech companies, and their understanding of what
they are, and the fact that they are about
power, and in many cases obfuscating it by, like,
giving you free email or whatever the case may be.
It's like, now you're in control, when really, like, you,
like, at Google, by having the slogan be "don't be evil." Sure. Yeah, yeah,

(53:37):
like, all of that, like, kind of having the scales
fall from our eyes over the past decade or something.
I mean, what's been your experience of that from the
inside, and especially among the people who work there? Because, like,
that's been the crux of it, right? Is that
people are like, oh, wait, this is not... like, I don't
know if I signed up for this. Like, I don't know. I
mean, I think I had
(53:59):
a little bit of a head start, just because I was,
you know... I don't think it's an accident that
the people who have sort of led the research and
the critique, and sort of, you know, the insights we're
drawing on as sort of tech critics, or whatever you
call us, um, are women of color, generally, right? Um,
you know, there is something

(54:21):
about, you know, coming into those spaces and, you know,
always already not belonging, because of how white they are,
because of how, you know, these are deeply racist, deeply
misogynist spaces, like, you know, wildly so
in some cases. Um, I think, you know, there's sort
of a sedimentary accrual of experiences that has happened.
Like, people talk. There was always a whisper network. There

(54:43):
always is a whisper network, right? Um, you know,
a lot of things sort of happened
and changed. Like, when I started in two thousand and six,
there were, you know, somewhere between three and five thousand
workers at Google, right? There are over two hundred thousand now,
and, you know, that is made up of, you know,

(55:04):
more than half of those are contractors and temps, which
is also the trend of sort of, you know... actually,
work isn't safe for anyone. There is no such thing
as non-precarious work, and, you know, if it hasn't
happened to you yet, it's happening to you, right? There's
a whole, you know... historical changes: Trump happened, right? You know,
Facebook Cambridge Analytica happened. There's been, you know, both a
swell in the power of these companies and, at the

(55:26):
same time, some, like, kind of egregious and
embarrassing disasters. Um. So, you know, I don't think
it's any one thing. I think all of those things...
you know, I can speak to Google: the thing
that I think, more than any other one
event, sort of shifted the tide was the Maven contract.
And this was a secretive contract that Google had with

(55:47):
the Department of Defense to build, you know, drone targeting
and surveillance AI for, you know, for the drone program, right.
But it was facial and object recognition, is that right?
It was object recognition. Um, it had a sort of
"person" category when we looked through the code, right? Like,
you know, it is... and it's trained on images

(56:08):
from, you know, from the Middle East. Like, again, we
don't know. But I think this also gets to... like, there's
an idea... you know, a lot of the
people who were sort of pushing back against our dissent
there, against our call to cancel this contract (Maven was
the name for it), um, you know, were saying, like:
you know, well, the best technology will actually,
you know, save lives, right? Like, how are you withholding

(56:31):
this from the people who need it? What you're doing
is actually sort of, you know, misguided, and you're going
to cost lives, because they don't have the intelligence they need,
and somehow automation will help that. But, you know, it
doesn't answer the fundamental question: your data is
only as good as your input, right? There isn't a magic,
you know, amount of data that's going to tell you
more than what you know. And this data is always,

(56:52):
always irreducibly labeled by human beings, right? It only reflects
those insights. And so, you know... again, I think
this gets back to an earlier point around the sort
of, like, eliminating or displacing of jobs: I think
it just degrades and hides those jobs, right? There's
always an army of precarious workers who are labeling that data, right,

(57:12):
saying, like: this is a, you know... this is a house,
this is a sheep, this is a person. That trains
the machine-learning algorithm, um, and it never gets better
than that, right? Those are the core subjectivities that
are embedded in that data. There is no sort of
magic wand, right? It's all people all the way down.
But it obscures, you know: where was there a human choice?
Where is there accountability? You know, is it Google or

(57:33):
the DOD who is accountable for another civilian-bound
strike, because it was informed by this software, etcetera?
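
To make the "people all the way down" point concrete, here is a minimal sketch of how human labels bound what a model can learn. Everything below (the data, the labels, the annotator's judgment calls) is invented for illustration, and scikit-learn is used only for brevity:

# Illustrative sketch: a supervised model can only echo its human-made labels.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical image descriptions, with labels chosen by human annotators.
# Note the subjective call on the last item: someone decided a canvas
# shelter counts as a "house". The model will faithfully learn that choice.
descriptions = [
    "four walls and a roof",
    "woolly animal in a field",
    "two legs, walking upright",
    "canvas shelter with a roof",
]
labels = ["house", "sheep", "person", "house"]  # human choices, errors and all

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(descriptions)
model = LogisticRegression(max_iter=1000).fit(X, labels)

# Whatever this predicts, it can never know more than the annotators encoded.
print(model.predict(vectorizer.transform(["canvas shelter in a field"])))

The sketch is trivial on purpose: scale it to millions of images and thousands of precarious annotators and the structure is the same. The labels are still the ceiling.
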
Oh my god. So you're saying there's a person inside RoboCop. Yeah, yeah,
an army of people, and they're not paid well. That's
the thing. I mean, that's like... I worked at
a tech publication for a couple of years, so I was sort
of, like... I mean, I was covering entertainment and didn't

(57:54):
know shit about tech, but I was sort of in
the world; it was around me in a way. And,
like, that was the thing that always sort of baffled
me, on a very abstract philosophical level: again, this
thing of just, like, why do you deny that there
are humans? Like, why deny the presence of humans? And
why does that equal the future? You know, it's such
a strange groundwork to start from.

(58:19):
Like, if you're a tech... if you're one of these
dudes who runs one of these companies, if you're
a Peter Thiel or something, like... why? I don't know, like,
why is that your assumption of what progress looks like?
It's very strange. I don't ask you to
have any psychological insight on any of these people, but
if you have any, that would be great. Uh, I mean, yeah,

(58:39):
they're not... There's kind of a cult of the socially awkward,
which has been used to minimize and justify bad behavior
for a long time. But I also think, you know,
as a materialist, I'm always like: well, it's a lot easier,
you know... if you can sell something as replacing kinds
of humans, you know, do it, right? Like, you don't
have a "genius of AI" story or an "amazing

(59:00):
founders" story if you counted all that labor, right? You're
not the heroes if you recognize that this actually takes
a massive infrastructure that is built on irreducible human labor,
that is built on, you know, this sort of
extractive economy of, you know, taking or making
or faking a bunch of data to do all of
this, right? It's not actually genius that's happening. It's sort

(59:22):
of the, you know, uh, you know, regulatory arbitrage and
resource extraction. Um, it's not a story
that's as marketable, right? And, um, I think for too
long we've just kind of guilelessly believed this idea that,
you know, if you had an algorithm and an idea
in a garage, you could, you know, create
something new, and that, when we see these ideas,

(59:44):
they are progress, and they do represent a sort
of, you know, positivist arc towards something better. Right. It
seems like there's a lot of, like: if you can
do something, you have to, um... which reminds me a
lot of, like, the DDT thing in the
twentieth century, of just sort of: hey, look what we
can make. Uh, and people have to be like: oh, actually,

(01:00:05):
just because you can make a poisonous gas, it doesn't
actually mean it's like an advance of technology or just
you know, but there are reasons not to do things.
There are moral and ethical reasons not to do things
that can be done. But what I love about
your work so much is also you're saying, like these
things don't even work most of the time. No, I

(01:00:27):
mean, they work to, you know, extract money and,
you know, consolidate their power, right? So, like, in a sense they
are working, because, like, ultimately, who's measuring what? You know,
they're kind of measuring that. But yeah, these... you know,
they're janky, they're fallible, they're incontestable, they're frustrating. So
would you... would you include the thing that

(01:00:48):
everybody's talking about right now, which is, like, facial recognition
software, um, at protests and stuff like that? Do you think
that that's ultimately not, um, a huge concern, or the
technology just isn't there yet? Like... Yeah,
I mean, I gotta be careful, because... yeah, it does...
it can recognize faces, right? But of course it, um,

(01:01:08):
it reflects the historical patterns of racism and misogyny that
any technology created in, you know, in the context of,
you know, its time will reflect. Um, and so it is...
you know, facial recognition software has been shown repeatedly to
produce far more errors for Black women, for Native people,

(01:01:31):
for gender-nonconforming people. So it's, you know... yeah, it
is dangerous, because, you know, live facial recognition
can track who's there, match them with a database of identities.
And because of driver's-license databases and sort of government
ID programs, and other programs, like
Flickr or Facebook, where we've uploaded our faces
with our names a number of times... that can

(01:01:51):
be, you know... that is possible, and it's
very dangerous. And I have, you know... I have advocated
sort of banning facial recognition outright, um, given those dynamics.
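
As a rough sketch of what "match them with a database of identities" means mechanically: the system embeds each face as a vector and returns the nearest stored identity above a similarity threshold. The embedding function here is a pseudo-random stand-in for a real face-embedding network, and every name and number is hypothetical, not any vendor's actual pipeline:

import numpy as np

def embed_face(image_id: str) -> np.ndarray:
    # Stand-in for a learned face-embedding network (purely hypothetical);
    # pseudo-random unit vectors let the sketch run end to end.
    rng = np.random.default_rng(abs(hash(image_id)) % (2**32))
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)

# Identity database, e.g. assembled from driver's-license photos or scraped
# social media uploads (placeholder names only).
database = {name: embed_face(name) for name in ["id_0001", "id_0002"]}

def identify(image_id: str, threshold: float = 0.6) -> str | None:
    """Return the best database match above a cosine-similarity threshold."""
    query = embed_face(image_id)
    best_name, best_score = None, threshold
    for name, stored in database.items():
        score = float(np.dot(query, stored))  # unit vectors: dot = cosine
        if score > best_score:
            best_name, best_score = name, score
    # One global threshold is applied to everyone, even though measured
    # error rates differ sharply across demographic groups, which is the
    # bias discussed above.
    return best_name

print(identify("camera_frame_42"))  # likely None in this toy setup
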
Oh, I just wanted to see if you had kind of
any guidelines for people. I mean, we're right now conducting
this podcast, we're Zooming so we can all see each other,
And I know that the way that we use technology

(01:02:12):
has changed so much, um, even over the course of quarantine.
It wasn't like we kind of started off quarantine doing
all the things that we now have to do to
keep up with work. But for you, who knows about
these things: how would you advise other people to take
precautions to protect themselves against, you know, having facial
recognition software used against them, and things like that? Like,

(01:02:33):
how do you personally kind of operate to protect yourself,
and how do you recommend that people who are either
going to protests or just doing a lot of, like,
Zooming and Zencastring and stuff like that... how do
you recommend that they take precautions? Yeah, I mean, there
are a bunch of good security guides for protests, but,
like: wear a mask, right? Mask laws are off
right now; we can wear masks. That's really big. Um,

(01:02:54):
wear big sunglasses, right? Those do, you know, perturb
facial recognition; it's good to do that. Um. But I
think you know, the main message here for me is
this is not you know, a kind of this is
not a problem of kind of neoliberal individual responsibility, right,
Like we are looking at yeah, and we're looking at

(01:03:14):
you know... we live in... we are in a world
where participation in, you know, social and political and
economic life requires using these technologies, right?
Like, going into a cave and only using Tor isn't
going to... you know, there are a number of reasons
that won't work, but it's not, you know... it's not
a choice for most people. And I think we have

(01:03:35):
to recognize that. You know, the things that I
am most excited about are the advocacy and the organizing,
right? We saw last year, in twenty nineteen,
sort of a wave of facial recognition moratoria
and bans that were organized locally, right? People said: no,
you can't, you know, use these technologies. They showed up
at their City Council meetings, you know, they showed

(01:03:56):
up at their schools. There's been organizing by parents around
the use of these tools in schools. But I
think it's, you know... I really do think that's, you know,
where it starts. I think there are people looking at
sort of all axes. You know, there's the No Tech
for ICE campaign, which has been really good at applying
pressure on the tech companies that are providing the surveillance and

(01:04:18):
tracking infrastructures that have fueled the family separation policy, and
are now... you know, we now have those same companies
working with the Department of Health and Human Services and
the CDC to collect COVID data. It's unclear what the
data-sharing agreements between, you know, CDC and ICE might
be, or HHS and ICE might be. So, like, there
are all of these things that are going to impact

(01:04:39):
you know, people, and that people are already organizing around.
So I think, you know, sort of echoing the credo:
like, look at, you know, who's doing what around
you and see where you can be helpful. But, you know...
I really reject sort of shaming people
for using technology; I reject that mode. Um, but I
do think, you know: connect it to a material issue
that matters to you and to the people in your community,

(01:05:01):
and you'll probably find a place where this stuff is,
um, doing some harm, and where you could do some organizing. Yeah.
I mean, I feel like pointing out that there are humans
behind all of this stuff, um, and that they need
those humans for these companies to function. Um... just with
the Amazon stuff that was happening with warehouse workers, uh...

(01:05:22):
those people. Yeah, I think, just to... to empower the
workers to know that the companies need them; the companies
can't function without them, yeah, and that gives them power.
Like, great organizing going on with gig workers: the Taxi
Workers Alliance, um, there's Drivers United
on the West Coast, there's organizing with Instacart workers

(01:05:44):
and Gig Workers Rising; there's, um, the Gig Workers Collective.
There's a lot of really interesting organizing, sort
of at the forefront, that is beginning to sort of
flex and withhold labor, um, and has seen changes based
on that. And so many of those tech companies seem
like they were just made to get around labor laws.
It's like the joke... it's like: the "move fast and
break things," and what you're breaking is labor laws, and

(01:06:07):
regressing to, like, you know, old piece-work laws,
or old piece-work norms. Like, I would
point to the work of a legal scholar, Veena Dubal,
who has looked a lot at this,
and it's like: actually, this is just a return to
the past, with some, like, entrepreneurial language left on top.
One thing I've also noticed is that there's like a

(01:06:28):
reluctance in tech companies to admit that they're doing anything wrong, ever.
I had a friend who was working for a company that
took a contract with ICE, and when she called them
out about it, they were like: well, we don't know
what they're using that facial recognition technology for; they could
be using it to reunite families. She quit. She quit,

(01:06:50):
but more power to her. Yeah. I mean, they'll say shit...
they'll say some wild shit, right? Like, I remember being
up on stage debating Maven with some executives at Google,
and they were like: well, tech is a hammer; a
hammer can be used to harm or to build. And
I was like... that is, like... from people who sort
of claim the expert knowledge on these systems, it was

(01:07:11):
just a wildly lazy statement. And I think it's clear
that, like, these people actually haven't thought through what they're
doing, right? They have allowed these sort of, like, extreme
Whiggish mythologies to assure them that anything
they do is doing good. And now they're confronted with
some extraordinarily complex questions that they don't have the training,
they don't have the background, and they don't have the

(01:07:33):
stomach to answer honestly. And so you get these, like,
strange-ass platitudes. For people who are working for these
companies in any capacity (and I'm sure we have people
among our listeners who do), whether you're a driver
for Uber or a programmer or whatever: what's your
kind of stance, generally, on, like, this thing that Molly

(01:07:55):
was talking about, where, you know, her friend decided not
to work for this company? I feel
like this is a debate all the time, where it's like: well,
should I, a good, conscientious person, uh, enter this
community and hopefully be some sort of agent of change
or good within it, or should I just stay
out of it altogether? Like, what's your

(01:08:17):
feeling on that? I mean, you know, this is complicated,
like, people need to eat, people need jobs... you know,
that is a problem, you know? Like, why...
you know, why do we have to spend so
much of our finite, magical lifetime, uh, in the service
of wage work? I think that's a question sort of,

(01:08:38):
you know, that frames this a bit. But I don't...
you know what? Like, I don't know. I was poor
growing up, and one of the things it gave me
was sort of access to, sort of, a class, and to
learning those mannerisms that, like, I wouldn't have had
other ways; you know, college did that to an
extent, um. You know, I started organizing when I realized
what I was doing as a sort of critic and,
like, internal researcher wasn't actually helping things. But again,

(01:09:02):
I think that's you know, if you're going to go
up against power, do it with other people. If you're
in a position to do that, do that. Recognize that
there are a lot of people who aren't always in
that position. You know, there are people on visas, there
are people who are, you know, the sole breadwinner. But
there are ways to sort of at least begin
to build like the relational bonds of solidarity with the

(01:09:24):
people around you, which in itself I think is
extremely radical, right? When you begin to replace your identity
in a workplace, you know, when you stop identifying with
your sort of position within a hierarchy as who you
are and how you fit in this space, and start
identifying with your relationships with the people around you and
what you owe them as sort of a practice of solidarity.

(01:09:45):
It's like a transformative thing to do. And, you know,
there's a reason that, um, like, capital
has fought back against those forms, and that those forms
haven't changed much over, you know, years and years, right?
Like, that is sort of how you build power. So
I think at least engaging in that, and then, you know,
knowing that, to be safe, you're gonna need, you know...

(01:10:07):
you're gonna need critical mass, so figure out where you
start from there. Well, that also... I think, by talking
about how, you know, we need to engage with technology
both as users and, many of us, in,
like, a professional capacity... the fact that it is
such a privilege to be able to opt out of
those things, one that almost none of us are afforded right now...
it seems like the best path forward is to,

(01:10:29):
like you said, to organize, to recognize the power
of kind of speaking up and, like, you know, uniting,
and, like, being critical of these institutions, is really the
only option. Like, I can't imagine right now many
people can afford to quit a job; it's probably the
worst time in the world to consider that. Uh, and
also just you know, the connection offered by social media

(01:10:52):
when everyone is isolated, it's you know, it's a difficult
time to consider just like making the choice to opt
out of those things. So that makes sense. Yeah, And frankly,
you can't opt out, right. You are the surveilled, right,
collecting it from street polls, They're collecting it from sensers,
They're collecting it from your credit card, They're collecting it
from the geo data on your phone. You know, you
cannot carry a phone, but then you can't get work email. Right,

(01:11:14):
They're collecting it, you know, from data brokers that, you know,
pull from your family's social media profiles. So there
isn't... again, it's not an individual thing: even if you were
to be sort of incredibly squeaky clean, even an absence
of data can be tagged as a sort of data, right?
So, um, it's not... I think "users" is actually almost
a dated term in this case, because we're, um...
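
As a toy illustration of the aggregation being described: records from unrelated sources, joined on a shared identifier, become a single profile, and even an empty source gets recorded as a signal. Every source name, key, and field below is made up; no real broker, schema, or API is depicted:

from collections import defaultdict

# Invented records keyed on a shared advertising identifier.
sources = {
    "street_sensor":  [{"ad_id": "a1", "seen_at": "5th & Main, 18:02"}],
    "card_processor": [{"ad_id": "a1", "purchase": "pharmacy, $23"}],
    "geo_sdk":        [{"ad_id": "a1", "home_cell": "tower_114"}],
    "social_scrape":  [],  # nothing found for this person
}

# Join every record into one profile per identifier.
profiles: dict[str, dict] = defaultdict(dict)
for source, records in sources.items():
    for record in records:
        profiles[record["ad_id"]][source] = record

# Even the gap is informative: "no social presence" becomes a field too.
for profile in profiles.values():
    profile.setdefault("social_scrape", "absent")

print(profiles["a1"])

The join itself is one line of bookkeeping; the power comes from how many sources feed it, which is the point being made here.
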

(01:11:41):
Well, thank you so much for joining us this week, Meredith.
This has been such an, uh, enlightening conversation, and we
really appreciate all of your insight and experience and
everything you have to say. It's great. Um, where can
people find you online? You can find me on Twitter. Um,
You can also visit the AI Now Institute website for

(01:12:03):
some of the work that, um, you know, that brilliant
team is doing, that I'm doing with them. Um, yeah,
I think that's about it. Um, I would also
encourage people in tech to check out Tech Workers Coalition, um,
which is, you know, a loose organization that is
helping workers in tech organize, um, and understand their power

(01:12:25):
and its context. Awesome, awesome. Well, thank you so much, Meredith,
and we hope to have you back because this was
a really really great conversation. So thank you so much.
It was a delight. Have a great day. Thank you,
bye bye. Well, thank you so much for listening to
another episode of Nightcall. If you enjoyed the show, please
give us a rating and review, and don't forget to

(01:12:45):
subscribe on iTunes or wherever you get your podcasts. You
can also follow us on social media. We are
@NightcallPod on Twitter, Nightcall Podcast on Instagram and Facebook,
and if you'd like to help support the podcast, you
can support us at Patreon dot com slash Nightcall. If
you'd like to give us a nightcall, please give us
a call at one four oh four six night or

(01:13:07):
you can email us at Nightcall Podcast at gmail dot com.
Thanks again for listening. We'll be back next week.