
May 10, 2024 51 mins

Campuses are canceling and shrinking graduation ceremonies after waves of student protests.

The protests have also triggered a wave of harassment, doxxing, and online harm.

Caroline and Sam, digital security and privacy experts with Convocation Research and Design, share some shocking facts and thought-provoking perspectives about state surveillance during the protests.

They also describe steps people can take to help protect their own privacy and safety. 

Follow Convocation Research and Design’s work: https://convocation.design/

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet. As a production
of iHeartRadio and Unbossed Creative, I'm Bridget Todd, and this is
There Are No Girls on the Internet. Graduation is supposed
to be a time for eager college grads making speeches
and walking across the stage to receive their hard earned diplomas,

(00:26):
but campuses around the country at places like USC, Columbia,
and Emory have either shrunk or canceled graduation ceremonies in
the wake of protests demanding universities divest from Israel, in
other words, withdraw any funds their university endowments have invested
in companies that are linked to Israel. The protests have

(00:48):
really escalated recently, and even before that, a digital billboard
truck funded by a well moneyed conservative group drove around
university campuses at Columbia, Harvard, and UPenn with pictures and
names of students, accusing them of being, quote, their campuses'
leading anti-Semites. One student whose picture and name were

(01:09):
on a billboard truck at Columbia spoke to The Verge, saying,
I literally did not leave my apartment the days my
friends told me about it. The student said that her
name was listed on the truck because a club that
she was no longer a part of had signed on
to an open letter urging Columbia to cut ties with Israel.
I completely wiped as much as I could of my

(01:30):
online presence and stayed in my apartment as much as I could,
the student said. Meanwhile, Canary Mission, a website that lists
people perceived as being supportive of Palestine in order to
keep them from getting future job prospects, has stalked students
across the country. "It is your duty to ensure that
today's radicals are not tomorrow's employees," a video describing the

(01:51):
project explains. Caroline Sinders and their colleague Sam are part
of Convocation Research and Design, or CoRD for short. They help people
who are being doxxed or at risk of being doxxed,
train people on how to stay safe both online and
off while engaging in protest, and advocate for technology to
be designed in more responsible ways. When I first caught

(02:12):
up with Caroline and Sam, I thought this whole conversation
would be pretty straightforward, you know, turn off your phone
out of protest, that kind of thing. But as we spoke,
Caroline and Sam described just how charged of a landscape
we're talking about. And I realized the conversation is so
much more complicated than that. So when I first reached

(02:35):
out to y'all to do this episode, I thought it
was going to be a pretty basic guide for how
folks protesting on campus and off campus can stay safe digitally.
I thought we'd be talking about pretty basic things like
use a passcode to unlock your phone so that
if you're detained by police they have a harder time
getting into it, or cover any distinctive tattoos, that kind
of thing. But then when you and I started talking,

(02:58):
I realized there was so much more going on here,
and then I needed to take a step back. You
all told me about things like provocateurs setting up fake
dating profiles and then going on fake dates with protesters
and trying to secretly record them saying things to make
them look bad James O'Keefe style, or how right wing
influencers are kind of selling themselves as citizen journalists

(03:20):
and trying to make names for themselves by engineering viral
content of protesters looking bad on top of threats to
protesters that you might traditionally think of, you know, threats
from campus administration or police. So can you just give
us kind of the lay of the land in terms
of the threats that these student protesters are facing right now.

Speaker 2 (03:40):
These threats are wide and varied. I think student activists
are seeing threats not only from outside, but from inside
as well, So they're having to contend with school administration,
sometimes hostile students who are against what they're protesting for,

(04:04):
as well as outside forces like the police, or sometimes
infiltrators, Project Veritas style, where they're
trying to get into protests in order to record people
who they perceive as their political opposition in order to

(04:27):
make them look bad and then posting that on the
internet for clout or subscriptions or money, as well as
a variety of other threats that I think a lot
of people don't really take into account.

Speaker 1 (04:42):
So you talked about people doing this for clout or engagement.
Do you think that's a big undercurrent of what we're
seeing here, Like people who are like, oh, this is
a great opportunity to put out some viral video that
makes a protester look bad or look uninformed, or look
threatening or dangerous. I'll put that online and become a
superstar myself.

Speaker 3 (05:00):
I mean, we are more in a time of the
kind of, like, citizen journalist, as some of them call themselves,
or, like, man-on-the-street interviews, where those are
very popular on, like, TikTok or Instagram or Twitter or
YouTube or Rumble, like the list goes on and on
of all these different platforms that they're able to monetize
these videos for and it is very big in the

(05:24):
the cultural climate that these protests are happening in.

Speaker 2 (05:32):
So yes and no, like they are a small part
of these protests.

Speaker 3 (05:37):
But any large event, I feel like, that's happening
collectively in the country is going to have probably a
similar amount of people that are chasing it for some
type of engagement.

Speaker 4 (05:52):
Just to build on that. You know, I really do
think we are in the age of the influencer. Like,
I think that is something a lot of folks
feel comfortable saying. It is really hard to say, you know,
citation needed, in terms of how many folks are attending
these protests in person or following them online, right, and

(06:13):
commenting in a way that is related to influencer or
citizen journalism. Like there are a variety of organizations that
are trying to identify student protesters and dox them and
put their information online. Like one that comes to mind
is Canary Mission, for example. But yeah, there are, or not a lot,

(06:34):
but there are definitely folks that are showing
up to the protests that, we could argue, have,
you know, maybe some more malicious intentions, that are in
this citizen journalism or influencer space. But there's also
lots of folks watching online trying to identify different

(06:54):
folks at these protests and sort of doing a kind
of, let's say, like, recap or an analysis. And this
is stuff, you know, we've seen in many other different
kinds of spaces and ways. Like what comes to mind
for me is, you know, the Johnny Depp Amber Heard
trial, where we saw so many sort of armchair investigators
and, heavy heavy quotes, body language experts, like this, this

(07:20):
is the lip readers and things like that. But that had,
like, a real impact on how people
really perceived that trial, and how, I would argue, it
really discolored Amber Heard, not as a victim but as
a perpetrator, when in fact she was a victim, right?
And so I think, you know, we're still seeing a
lot of that here and now with folks I think

(07:44):
also maybe not physically being at the protest, but commenting
on them, analyzing them, right, and providing different kinds of
like wrongful assessments.

Speaker 1 (07:56):
And similar to what we saw with the Amber Heard
Johnny Depp trial, have you seen that those
kind of armchair folks who aren't necessarily involved in the
protest but are like you know, commenting on the viral
video they saw from the protests or whatever, that that
is having an impact on how everyday people are

(08:17):
perceiving what's going on on college campuses across the country.

Speaker 4 (08:21):
I think it's leading to like stratification for sure, and
I think some of that is also you know, I
would argue actually also coming from a place even of
sadly like traditional media. One of the things I like
to highlight is we don't live in a harassment literate society,
and I think a lot of the analysis that is

(08:41):
sort of existing around this particular conflict, right,
is sort of caught up in a real lack of nuance, right?
So the inability to see let's say, the people of
Gaza as individuals that we are watching a genocide unfold,
that we are watching things where many respected, reputable international bodies, right,

(09:07):
are weighing in and saying like this is unprecedented in
terms of like a famine that's being caused, in terms
of the attack, in terms of the amount of journalists killed,
in terms of the amount of children that are killed
in this particular conflict, right? And a lot of what
I think was happening is a version of
context collapse, in which I think some folks are weaponizing,

(09:33):
anti-Semitism to push for other agendas, and
that is also creating a space in which it's very
difficult to have, one could argue, a nuanced conversation, right?
And I think that is a large part of this.
And then from there that kind of gives bad actors
a space to push different kinds of agendas and different

(09:57):
kinds of information, and also, in a way, almost again,
like, justify the doxxing of different activists, right? And so
I think it's sort of hard to like definitively say,
but I do see on, like, the, you know,

(10:17):
on the, like, right or conservative end, uh, this sort
of lack of nuance and inability to understand like a
human rights conflict really muddying the waters in terms of
also how do you contextualize or talk about let's say,
like student protesters.

Speaker 1 (10:35):
So we talked a lot about, like, media entities, I
guess I'll say, who are known provocateurs that, like, nobody
should trust. But I do think that national reporting,
it's not, I have not seen a lot
of national reporting that is super clear about what is
going on, like the fact that we're talking about
a nineteen-year-old college freshman or something being

(10:58):
up against a coordinated, organized, well funded institution with a
mission to dox them and make them look bad, right,
and so like, I don't feel like we are really
getting a lot of opportunities to have that full story told.
I think that the way that you have described it, Caroline,
as like we just don't live in a harassment literate

(11:20):
society, where, you know, the actual context of what that
would be like, and, like, why that is
happening, is not being reflected perhaps. And I guess I wonder, like,
how can we get to a place where that
story is being told more honestly, where people really understand
that, like, yeah, you're talking about a college student,
and then a billionaire funds a truck with their

(11:43):
picture and their name on it to drive around their campus,
right? Like, how truly fucked up that really is.

Speaker 4 (11:49):
Totally, and thank you so much for that. I
don't know how we get towards a harassment literate society.
That's why I'm really grateful for podcasts like There Are No
Girls on the Internet. I've been really grateful for a lot
of the programming we've seen across different kinds of civil
society organizations. I've been extremely grateful for Teen Vogue and its

(12:10):
like existence over the past few years. I think a
lot of this is really sort of insisting on having
nuanced conversations. But I think a lot of it also
actually does come down to different kinds of education, and
I don't know at what level that should should happen

(12:30):
or exist in Like, for example, with our organization, we
don't really work a lot with children for a variety
of reasons. Some of that is also just like regulatory reasons.
It's not one of our advocacy focuses because children are
like under eighteen, they're just a different legal group than adults.
But you know, I think there's something to be said
around how do we sort of talk

(12:55):
about things like digital harm and also offline harm. But
I think that this is also an issue that adults
run into as well. Like, I think it's difficult to summarize a
lot of harassment cases into a tweet, right, to summarize
it in a few characters. And I think in this

(13:17):
sort of space of us trying to maybe editorialize or
you know, take a stance on something, that often
really collapses the nuance of what we're talking about. I
also think that there's like a binary in which people
tend to approach good and bad, and then that sort

(13:39):
of weighs into harassment. I think that there's also a
deeper context here. So this is one of the things
I've noticed in some of the trainings we do with
folks that are, like, victims that are facing harassment or
have been harassed. And I've also noted this with folks
that are, let's say, friends of harassers.
It can be very destabilizing and difficult to sort of

(14:02):
come to terms with that someone you care about might
harm someone else. Folks often, I've found in my experience,
so this is, like, my observation, will sort
of reflect on their own personal feelings about it, right,
and what that says about them. And it's one of

(14:23):
those things where I like to say, harassment isn't complicated,
it's your feelings about the harasser that are complicated, right? Like,
those are the complexities. So if you're a fan of
Johnny Depp, it's probably very difficult to reconcile your fandom
and like this thing you've invested in, even with clear
evidence that he has harmed someone, right, or if someone

(14:47):
is your friend. So then what I've found is people
try to look for logical reasons, like, oh, what did
the victim do? There must be a reason that this
thing or person or institution or country I really like
is doing this. There has to be a logical reason.
And one of the things I like to point out
too is, like, sometimes there's not. It's like the bitch

(15:10):
eating crackers meme. Someone just might not like that other
person and that's all that it takes, and they happen
to like you. And that is sometimes just how harassment unfolds.

Speaker 5 (15:19):
Right.

Speaker 4 (15:20):
Other times people are seeking out a weaker opponent to
feel stronger. You know, there's all different reasons. My work
in particular doesn't really focus on the psychology of the
harasser or why they harass. That doesn't necessarily help me
help victims. What helps me help victims is often looking
at like how they're being targeted, how are systems being

(15:43):
subverted to harm them, how do we improve those systems,
and how do we also support victims in a trauma
informed way. So I think this is where we're sort
of seeing this almost at scale. I think the complexities
here are also in terms of policy, they get very complex.
I would say in terms of is it a genocide

(16:05):
or not, not complex: it is a genocide. But I think
this is where people also don't understand international relations or
international politics, i.e., that the people of Gaza are
not Hamas, that they don't understand how like aid work works,
that they don't understand different rules of law and rules
of war. In terms of also how or how journalists

(16:28):
and the media are allowed to engage and document, how
different spaces or sites are targeted or not targeted, right,
how aid and food is allowed to flow in and out.
Like, we work in human rights. These are
even areas that are complex to me because I don't
work in like war torn areas, right, So these are

(16:50):
also very complex systems to myself as someone who is
a human rights expert. And so I think there's no
space right now with how Internet conversation has been defined
or how we're engaging in it to be able to
have a conversation. I can't give you a one tweet
description or summary of this, like I have to give
you a long essay. And then this also butts up

(17:13):
against, you know, then different kinds of mis- and disinformation
that I think build on what Whitney Phillips calls people's
deep memetic frames, so beliefs that you have and then
you see those reflected in other spaces. So, like, I'm
from Louisiana; I very distinctly remember Hurricane Katrina.

(17:34):
You know, I can understand how certain folks in New
Orleans and Louisiana and Mississippi, where my mom's family is from,
like might have a slight distrust of disaster relief programs
or trusting the government to like handle disaster relief, right
because people have lived through many hurricanes and not seen
that go very well. So, like that's a deep memetic

(17:55):
frame of them saying like, oh, can I trust an
international body?

Speaker 5 (18:00):
Right?

Speaker 4 (18:00):
And while those are different things, you can see how
some of that mistrust gets sown and how, like, mis- and
disinformation can sort of exacerbate that at scale. So then
how do you have a conversation with someone who's had
lots of negative experiences with FEMA to then be like, well,
here's why actually you should trust, like, this UN
organization that you've never heard of that only operates in

(18:24):
this one region. It becomes actually very difficult to have
those conversations. And I think this is what I mean,
like these are the ripple effects of not living in
a harassment literate society, is that we can't look at
things just through a good and bad binary. You have
to look at the context. You have to look at
impact versus intent, So like what is the impact of

(18:45):
someone's actions versus like what they say their intent is.
You have to also, I think,
really analyze and think very deeply about one's own ego
in these situations, and like the seeking of justice or
like retribution, that that sometimes results in more and more
and more harm. And this is also very complex in

(19:08):
which you want to allow a space for victims to
be able to vocalize anger and distrust and dissent, like
they're like, you're allowed to be angry. This also then
conflates with, or combats, our very, I would say,
Global North, Western, American idea of, like, what a victim is.
There isn't one kind of victim. There's so many different

(19:29):
kinds of victims. There's no one way to embody victimhood.
And I think that's also a complexity people don't understand,
which is how we then get into I don't know
if I believe that person or I would have done
this in this situation, And the answer is like, you
don't know what you would do in that situation until
you're in that situation. And so that's also a very

(19:51):
strange and skewed way to view how an individual or
a group of people are responding to trauma.

Speaker 5 (20:03):
Let's take a quick break. And we're back.

Speaker 1 (20:16):
It reminds me so much of what we saw around
Gamergate in some ways, and just this idea that
what's happening is maybe not
so complex; the feelings surrounding it might be the complex thing.
And then the bit where we then don't have a
media that is informed enough or harassment literate enough to

(20:40):
really lay out what's happening, right, And so they try
to find a way of, like, reporting along those binaries
of, like, well, maybe they really do
just care about ethics in games journalism, and, like, maybe
this woman really did deserve it and have it coming,
and maybe, you know, maybe women have taken things too
far in their criticism of gaming or whatever. Rather than

(21:03):
actually laying out like the sort of complexities that you
did and the nuance that you just did, it's so
much easier to have it be this binary, simple thing.
And I almost wonder if they're, like, giving
us what we want in a kind of way, because
we want simple answers. Sometimes we want like oh bad guy,
good guy, or here's the reason they did that, and

(21:23):
and sometimes what we don't want is, like, oh,
well it's actually a really complicated situation totally.

Speaker 4 (21:29):
I mean, this is something I see where I talk
about a lot in my coaching with different victims of harassment,
is that one of the very unsatisfying
things that comes out of harassment. There's many unsatisfying things.
There's many awful things. There's many harmful things. Is that
you might never get a why, and the why might

(21:51):
be something that feels simplistic or minimal or naive, or
you know, not even very well thought out. And one
of the things I try to really coach people through
is it might also not be worth thinking about why,
Like it might actually, you know, it is your being targeted,

(22:14):
but it might not have anything to do with you.
And in terms of seeking the why, like why this
person is doing this, that that might not give you
the peace of mind or the closure that you're looking for.
And I think that's really hard
for people. And I know that, like when I have
faced harassment, sometimes that's even hard for me, even though

(22:36):
I know that, like, if someone were
to tell me why, it wouldn't be
the answer that I'm looking for. It maybe wouldn't solve
any of the conflicting feelings I'm feeling about that particular instance.
And I think, you know, this is where it is
sometimes important to separate harassment research and harassment literature

(22:57):
from let's say international politics and war and conflict. But
I do think it is at times important to
sort of understand, like, the why can be incredibly simplistic,
in heavy quotes. Like, I think a lot about my
partner's family. He was born in nineteen ninety one in

(23:18):
the Western Balkans, when Yugoslavia was breaking apart, and like,
what are the whys there? There's a lot of whys
that Yugoslavia was breaking apart. There's a lot of whys
in which different groups were targeted, and
lot of those whys are not satisfying, you know, because
it comes down to hate and discrimination and Islamophobia and

(23:41):
like very entrenched like ethnic ties and views of a
particular region. And that's an overly simplistic way to even
look right at that region. But that's also a why,
and that doesn't explain or excuse anything. And I also
don't know if that improves what it feels like

(24:01):
to have lived through that. That's just an observation, right,
And I think sometimes, you know, with
the most awful things that happen, there is a simple answer again,
which is unsatisfying, which is just we live in a
society that is full of bias and hate and power differentials.

Speaker 1 (24:22):
True, but I could see how that's not the most
satisfying answer, particularly if you've been in a situation that
has like upended your life, right Like if you're like, oh,
someone got a picture of me at a protest and
now I am unable to live a normal life. Having
the answer be like, oh, well, we live in a
society that's full of hate, and somebody identified you

(24:45):
as an easy target to pick on, and that's what
they did. I could see how that could be unsatisfying.
But it goes back to what you said about sometimes
the why is not necessarily worth dwelling on totally.

Speaker 4 (24:57):
And this is something where I'd like to say, like
that also doesn't excuse the harm you're facing, Like like
you shouldn't be facing that, right, Like that's something that
no one should be exacting or putting on to you
for you to experience, which I think is why Sam
and I really do focus on victims and their care. What
are the ways in which systems, inadvertently or you know,

(25:21):
via their own design, create flaws and vulnerabilities
and spaces for harm. Because I don't need to know
the why to focus on the victim. We don't need
to know the why to support someone and even for us,
I think, even if I were to put on my
research hat, the why also isn't a satisfying thing for

(25:42):
me to seek personally, even in a research context.

Speaker 1 (25:47):
So you might be listening to this and think that
it sounds like I'm trying to dissuade you from showing
up to a protest irl because you could find yourself
a target for this kind of harassment surveillance and criminalization.
But Caroline and Sam don't see it that way. In fact,
they say that you should still make your voice heard
while also having a good understanding of the realities of
our surveillance landscape and what's at stake. When I reached out,

(26:10):
I thought this conversation was going to be a lot
more simple than it ended up being. And I think
that is reflective of the need to talk about this
whole conversation in a way that is surveillance literate, right,
that, like, really takes into account the full sense of,
like, why experts like yourselves make certain recommendations to people,
how it connects to a larger criminalization and surveillance landscape

(26:35):
when it comes to things like protest and dissent. And
so I wonder, like do you think we live in
a surveillance literate society, and do you think people have
a good sense of like how these things are all connected.

Speaker 5 (26:48):
Oh?

Speaker 3 (26:49):
Absolutely, absolutely not. We do not live in a surveillance
literate society. I mean, the capabilities of state and
non-state actors are not really completely understood. I
feel like protesters or activists, shall we say,
are oftentimes operating with a knowledge base that's maybe like five, ten,

(27:11):
fifteen years backdated even, sometimes. Or sometimes people will
just be like, oh, well, the adversary, maybe the state,
we'll say that's the adversary in this example, is so
advanced that why should I even bother? Like they can
read everything that I'm doing, they can read my mind.

Speaker 2 (27:32):
Basically, they're just that capable. Not the case.

Speaker 3 (27:38):
The opposition is oftentimes not as sophisticated as we think.
And there are things that we can implement, best practices
in digital security and privacy, that
will keep us safer. And it's like super simple

(27:59):
stuff. Use Signal with disappearing messages for communications; like, don't
just use text messages or phone calls. Super simple
stuff like that will keep people safer. Like you said
in the beginning of this podcast, don't have a
face unlock on your phone or fingerprint unlock on your phone,

(28:20):
because the cops, if you are arrested, can just take
the phone and hold it up to your face and
then unlock your phone.

Speaker 2 (28:27):
Stuff like that. But backing up a little to surveillance.

Speaker 3 (28:32):
You know, the capacity for surveillance has really increased with
the proliferation of AI-assisted facial recognition, or, you know,
HD cameras that can do like a thousand-x zoom
and still have like 4K quality, or even like

(28:56):
we could talk about all the different optics that are
on like helicopter cameras, police helicopter cameras that can like
see in thermal, in infrared, in different heat signature patterns
in you know, movement outlines and stuff like that, to
where they can also like zoom in to where they

(29:16):
can essentially see a protester's screen on their phone from
a helicopter.

Speaker 2 (29:21):
That type of.

Speaker 3 (29:21):
Stuff, or just, like, stuff that's been around for
like decades, like police pole cams that they'll just, like,
stick somewhere and people just won't notice, or ALPRs, automatic
license plate readers to kind of map, like where people
are driving around all of these things. They exist, they're
in the real world, and they're used by law enforcement

(29:45):
on an almost daily basis. And this is just like
the physical world that we're talking about.

Speaker 2 (29:49):
This isn't even the digital world we're

Speaker 3 (29:52):
talking about, like them having the capabilities of seeing all
of your social media accounts interlinked and essentially pulled together
in one little package deal that is maybe like this
is Sam, Like this is his Facebook, this is his Instagram,
this is his Tumblr, this is his MySpace from fifteen

(30:13):
years ago. They have these capabilities, but we can do
things to prevent them from having fuller access to them,
and we can make things private. We can have burner accounts,
burner emails, burner phone numbers, things that aren't maybe directly
associated with us.

Speaker 2 (30:33):
And there again this.

Speaker 3 (30:35):
Does boil back down to best practices in digital security
and privacy.

Speaker 1 (30:40):
Yeah, I have heard the feeling like, oh, the state,
they have everything about me, why bother? I've heard that
described as, like, a kind of nihilism, like surveillance nihilism,
where you're like, it doesn't even matter anymore. Like,
I get it, because the list of ways the state
surveils us is vast, and

(31:02):
I totally understand that as a reaction. But Sam, you're
so right. There are still things we could do,
basic steps that we could take to maintain privacy, and
I think the state kind of, like, is counting
on us being like, oh, well, there's no point of
doing any of this. I may as well just leave
my phone unlocked. I may as well just like not

(31:22):
wear a mask. I may as well just whatever, because
they already you know, they already have everything.

Speaker 5 (31:27):
It is.

Speaker 1 (31:27):
It is, there are things you can do, and the
state totally benefits from us thinking that that's not true,
that there's nothing we can do, despite the fact that
I do understand that as a reaction totally.

Speaker 4 (31:38):
And I think there's something else also here, too, that
I want to highlight. What I'm about to say sounds contradictory,
and I'm going to try my best to say it
in a way with nuance. Lay it on us. The
state is very powerful, and I think also at times
people overestimate or don't totally understand how the tools of

(32:00):
the state work, so they have a lot of data
on us. But also, like, there's a lot of things
that surveillance technology can't do that I think people often
misunderstand it can do. And that isn't
to say we should all take a sigh
of relief, you know. I think we should all be

(32:21):
like breathing in and out constantly, you know, trying to
quell different levels of anxiety. And I say this as
a highly anxious person.

Speaker 5 (32:29):
Same.

Speaker 4 (32:31):
But one of the things I think is important,
and I'm thinking about this in, like, a protest context,
is, you know, like, the state has a lot of
data on a lot of people, as we learned from,
like, Edward Snowden and the PRISM program,
for example. But also there's, like, so much data that

(32:51):
at times it can be hard to identify people. And
that's the other end of security nihilism that I see
is people being like, I'm just a drop of data
in a very large data bucket. And for me, it's like, well,
you never know when the state is going to decide
to look at you, And if you have any marginalized identity,
then you know already that bad actors of the state,

(33:13):
like a local police force, they don't need very much
to decide to look at you. In fact, they need
almost nothing to decide to arrest you to like, you know,
fuck up aspects of your life, et cetera. And that's
only getting worse and worse now, especially in the
United States. If we look at the overturning of Roe v. Wade,
of attacks on gender affirming care, like it's so difficult

(33:36):
to sort of just do things we should be able
to do. One of the things I want to highlight
this is something I think we've seen a little bit
in our training, is people sort of at times sort
of assuming that let's say, like police surveillance is magical,
or that like AI is magical. It's not. There are

(33:56):
things we actually can do to be safe. And I
think that's like what I want to sort of emphasize.
This is like the space of nuance, Like we should
be afraid, and also there are things we can do
to try to like mitigate and reduce these harms and
be safer, and we should do those things. We should
especially do those things, and we should do those things

(34:17):
if we're also in community with marginalized community members. So
like I'm a white, non binary person, you know, I
was born in the United States. For me, going to
a protest unmasked even though it's COVID, So actually you
should all be wearing masks out there anyway. But like

(34:37):
if I were to go hypothetically unmasked, which I wouldn't do,
Like me getting arrested is very different than someone else
getting arrested. But I shouldn't be worried just about me.
I should be worried about the people I'm standing with, right,
And I should be worried about how I've saved those
people in my phone, And I should be worried about
what kinds of things can be done with just a

(34:58):
little bit of like nudging from the police.

Speaker 5 (35:02):
Right.

Speaker 4 (35:03):
And this is where understanding security measures and privacy measures
and having good what Matt Mitchell calls like security hygiene
and digital hygiene is really important. Right, So like disappearing messages,
using signal, not bringing my phone at all to a protest,
these are really helpful things versus going to a protest

(35:25):
and thinking you need like a Faraday bag and a
mic jammer and all these things. But then let's say
you're connecting to public Wi Fi and you're messaging and
you're posting on Twitter, like that has almost negated the
other things. You also don't necessarily need something like a
mic jammer if you like, you know, if you don't

(35:46):
have malware on your phone.

Speaker 2 (35:47):
Right.

Speaker 4 (35:47):
So I think that there's these things where people sometimes
sort of over estimate the capabilities of some of the
tools that the police have and then underestimate how little
evidence the state needs to seize your devices or be
able to engage with your phone. And so I
think it's this balance of understanding where it does like

(36:10):
security on the device stop, and like good protest tactics start,
like my good physical security, right, And I think these
are very much intertwined. And I think that's also another
level of complexity we've been dealing with, Like we've also
worked with activists who are really nervous and like want

(36:32):
to put their phone in the freezer or the microwave.
And you know, that's something too where it's like I
totally hear that you're scared if you want to do that,
Like I can understand how that makes you feel safer,
But the issue more is like, let's talk about what
would be happening on your phone that would cause that
to happen, and also that there's an even more serious

(36:57):
question of that means there's something on your phone that
is then listening in all other different instances, and so
if you're really worried about that, then the problem is
something we need to look at on the phone.

Speaker 2 (37:08):
Right.

Speaker 4 (37:08):
The problem is spyware or malware on your phone, right,
And that's something we need to deal with immediately. And
I think that's also something that I think is very
difficult to understand or work through if if you aren't
a technology expert. And that sucks, because you know, we

(37:29):
want people to have the most accurate and like salient
information as possible, and I think a lot of the
barrier to this is having to be constantly updated with
how technology works, what are the different kinds of tools
and apparatuses of the state, and then how do you
get that information into people's hands.

Speaker 1 (37:54):
More after a quick break, let's get right back into it. Yeah,
I'm almost cringing a little bit because I definitely went
to a meeting where we all put our phones in

(38:15):
the freezer because somebody saw it on the Snowden Doc.
And then I definitely like logged onto Starbucks WiFi. I
had a whole conversation about what we talked about via
Starbucks WiFi, via just SMS messages, like oh well, like
the more like it felt cool to put our phones
in the freezer, but I took no other security steps

(38:36):
that day, and in fact did things that were like
risky that you know, a little common sense, basic considerations
would have probably been more effective or made a bigger
impact on my security or my made a bigger impact
on my security than like, let's all put our phones
in the freezer. And I think, Caroline, you made a

(38:56):
point about people not being tech experts. I wonder, you know,
in twenty twenty four, with the ubiquity of smartphones and
facial recognition technology and doxing and like all of these
technological innovations and advancements, do you think that folks feel
a barrier to protest or to speak up and use

(39:17):
their voices because it seems like, well, this is so much,
this is so opaque. Some of the guides and ways
that people talk about technology and security feel not accessible.
So I'm just gonna like not show up because who
has the time? Like I wonder, is that something that
you've seen in your work?

Speaker 4 (39:33):
So I haven't personally seen that, And some of that
I will highlight might be from where we sit as
an organization because we tend to work with other community organizations.
We tend to work with human rights defenders, journalists, activists,
and then members of the general public who like want
to engage with this knowledge. So I would say that like,

(39:56):
I haven't seen necessarily a deterrence. What I have seen
is people showing up but still being scared and or
showing up and then something bad happens and they weren't prepared,
let's say, for what was happening, and
I want to highlight, that's not their fault.
Like we live under surveillance capitalism, how software and hardware

(40:20):
has been designed is not the fault of the user
or the vulnerable individual. That's the fault of capitalism and
big tech, and like the lack of regulation we
have in the United States, right, and so I think
luckily at least you know, Again, also we're speaking from
a very specific sort of space convocation, our lab, in

(40:42):
which we are often engaging with people that are you know,
human rights defenders, journalists, activists, community organizations. So we haven't
seen a hesitation. But what we have seen is people
recognizing that there are like skills and tools that they
don't have, and they're seeing that also in

(41:03):
real time. I think right now because of the amount
of doxing attempts that are happening on social media, and
so I think that's weighing in. People are now
a bit more aware. You know, there's been many stories,
as you know, Bridget, as you pointed out, with billionaires
renting buses and putting people's faces on them or

(41:23):
putting people's names on them. And I think that is,
you know, causing folks to reflect and be like, oh,
like that could be me, and I still need to
show up. I still want to show up, but how
do I how do I create or maintain some safety
knowing that that is a potential outcome. So that's a
little bit more of I think what we've been seeing.

Speaker 1 (41:43):
I started this conversation wanting to use the focus or
framing of like campus protesters, campus activists, and what's happening
on campuses right now. But how have you seen
these same tactics being used to target anybody, like people
who are not necessarily protesters, who maybe work for companies
that are like deemed too woke or have in some
way been like perceived as ideologically against the bad

(42:08):
actors who are doing the doxing. Like, is this the
kind of threat that really all of us might need
to be aware of, whether or not we've ever set
foot on an IRL protest on a campus.

Speaker 4 (42:18):
I would say so. I would say, with how you know,
regardless of the country you live in, with how politics
have been going over the past few years, it is
always good. It's very important to be thinking about your
own digital footprint, data that's out there about you, and

(42:39):
how it can be used or misused to harm you.
I think that that's something we all now especially really
need to be thinking about.

Speaker 1 (42:49):
I don't want people to listen to this episode and
think every threat that we have discussed is something that
they personally like, are likely to become a
target for. Like, I don't want people to be paranoid.
I want people to be informed and smart, right, And
I guess, like, in your work, how have you prepared
people to understand like their specific needs in

(43:12):
this whole conversation.

Speaker 4 (43:14):
Oh gosh, I think this is one of the hardest things.
This is where this is why we do I think
really targeted workshops. But Sam, do you want to, do
you want to.

Speaker 2 (43:21):
Weigh in? The answer is threat modeling.

Speaker 4 (43:25):
Oh, you were going to say that, I was going
to say that, Sam is going to say it.

Speaker 3 (43:31):
Well, we all have different threat models, right, and what
Caroline's threat model is going to be way different than mine,
and what Bridget's is is going to be way different
than Caroline's. Threat modeling is the process of trying to
figure out the possibility or probability of a threat that

(43:52):
the person may encounter, weighing the consequences
of that threat against how realistic it is that it
will happen or will not happen. And I think that
people can try to figure out what their personal threat

(44:13):
model is by considering who the bad actors that they
may encounter are and what their capabilities are, and then
what they're trying to protect. So that might be, maybe, student
protesters that want to keep their anonymity, and so maybe
they will not bring their phone to a protest, or

(44:35):
maybe they will wear a mask or cover their tattoos
in order to not be identified physically. So that's just
like one example of a of a threat posed to
campus activists. And this threat modeling process can really be

(44:59):
used for all aspects of life. You can threat model
everything in your daily life.

Speaker 4 (45:05):
I mean, I just to build on that. Like one
thing I want to highlight is we all threat model
every day, as Sam is saying, like you can use
it in your everyday life, and you do, Like when
you decide to cross the street not at the light,
you are threat modeling right when you're sort of making
a decision around how you're going to get home or

(45:26):
go to go somewhere. That is like a form of
threat modeling. And I think I think that there, this
is I think this is what gets really tied into
then trying to understand a bit more about surveillance literacy
and like security literacy and privacy literacy is really important.
Threat modeling is incredibly important. It's how you can decide,

(45:48):
you know, it's how you can maintain some safety. What
I think is harder, one of the challenges, is also helping
folks feel safe and secure around threat modeling. And some
of that now comes down to how do we understand
like the tactics and tools of our adversaries. That's understanding

(46:09):
their why, right, so the why doesn't matter, but it's
understanding like what are they using? And so I think
for folks that are going to actions, please go. Please
try to wear something that sort of helps anonymize you.
So I would not recommend wearing your you know, Cal

(46:32):
State the-year-you're-graduating shirt, or like your really
awesome jacket that you made that's one of a kind.
I would recommend wearing something that's a little bit more plain.
I would really try to cover your face, cover your tattoos.

(46:52):
I would recommend leaving your phone at home if you
feel comfortable doing that, you know, I think this gets
into a different space if we're talking about folks that
are either there to observe or document. There's a lot
of great guides out there on how to document a

(47:16):
protest safely. Some of that, you know, we point to
our friends at open Archive and Witness who have really
great guides written for human rights defenders on how to
document actions and safely upload them. We recommend following that,
you know, please check out different safety guides that have

(47:36):
been put out, you know, pretty recently from the EFF
or The Markup in terms of how to stay safe
at a protest. We're updating, we're updating and creating a
new anti doxing guide, and we're also putting out guides
hopefully soon. But a big thing is, you know, maybe
don't bring your phone with you, and if you have
an iPhone and you need to bring your phone, consider
putting it in Lockdown Mode and turning it off.

Speaker 5 (47:57):
Yeah.

Speaker 1 (47:58):
We'll actually hear from the EFF technologist who wrote that guide
next week about some concrete tips for digital security at protests.
So folks should definitely tune in, But are you all
working on any guides that folks should know about?

Speaker 4 (48:10):
One thing is like, you know, if you turn your
phone on and you've brought with you at the protest
and you're like tweeting about where you are and you're
taking a photo of where you are, that might negate
a lot of the safety tips that you've already gone through, right,
And so one thing to consider is also like can
you put that out later? Can you scrub metadata from it?

(48:33):
Are you doxing accidentally your fellow protesters? Like are they
are their faces covered? You can use Signals face blurring
tool which they have you can blur out people's faces.
That also helps strip the image of metadata. So there's
like all these different things you can do, and I
think it's worth doing those and going to the protest.
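[Editor's illustration] The metadata-scrubbing step mentioned here can be sketched in code. This is one hedged approach using only the Python standard library: it copies a JPEG's segments while dropping the APP1 through APP15 and comment segments, which is where EXIF metadata (camera model, timestamps, GPS coordinates) lives. Dedicated tools like Signal's blur feature or exiftool are more robust; treat this as a sketch of the idea, not a vetted security tool.

```python
import struct

def strip_jpeg_metadata(data: bytes) -> bytes:
    """Copy a JPEG's segments, dropping APP1-APP15 and COM segments,
    which is where EXIF metadata (camera model, GPS location, timestamps)
    is stored. APP0 (JFIF) and the image data itself are kept."""
    assert data[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(data) - 1:
        if data[i] != 0xFF:
            out += data[i:]          # unexpected byte: copy the rest as-is
            break
        marker = data[i + 1]
        if marker == 0xDA:           # SOS: entropy-coded image data follows
            out += data[i:]
            break
        seg_len = struct.unpack(">H", data[i + 2:i + 4])[0]
        if 0xE1 <= marker <= 0xEF or marker == 0xFE:   # APP1..APP15, COM
            i += 2 + seg_len         # skip: this is where EXIF/GPS lives
        else:
            out += data[i:i + 2 + seg_len]             # keep everything else
            i += 2 + seg_len
    return bytes(out)
```

Re-posting only the cleaned bytes (or a screenshot, which has no original EXIF at all) avoids accidentally publishing where and when a photo was taken.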

Speaker 2 (48:54):
I think some of this is.

Speaker 4 (48:55):
Also shifting our own concepts of what it means to
sort of document and protest safely.

Speaker 2 (49:02):
And I think some of that is.

Speaker 4 (49:05):
Recognizing that, like going to the protest is almost more
important than publishing an image that you were there. I
think there's other ways to talk about being there, And
that's not to discourage people from posting, but rather it's
to say threat model and really think about what's in
this image? You know, what does this image reveal?

Speaker 2 (49:26):
Right?

Speaker 4 (49:26):
What does this image reveal about me? Does it reveal
about other people? Could someone be identified? Who is you know,
who I'm protesting with? How could this negatively impact them?

Speaker 5 (49:40):
Right?

Speaker 4 (49:41):
I think it is thinking about some of those some
of those things as we engage in collective action. And
so those are like some of the tips I want
people to like think about, which is, you know, make
a plan with your friends later of when you're going
to meet up and pick a time and make sure
you know how to get there, and and if you

(50:01):
don't show up in a certain amount of time, maybe
that's a signal to them that something has happened, and
like have that conversation. And these are like really safe
ways to still go about and engage in this really necessary,
I would argue, like civic action that we need to
be engaging in, and that's just a really great way

(50:22):
to make sure your phone isn't, you know,
contributing to this ongoing surveillance apparatus, or just turn your
phone off and don't turn it on until it's over.

Speaker 1 (50:38):
Caroline Sam, thank you so much for being here, and
like truly thank you for your work. We need folks
like you who are making it easier and safer for
everybody to use their voices right now. So I hope
this gives people a sense of how they can do that.
Thanks for being here. If you're looking for ways to

(50:59):
support the show, check out our merch store at tangoti
dot com slash store. Got a story about an interesting
thing in tech, or just want to say hi, You
can reach us at hello at tangoti dot com. You
can also find transcripts for today's episode at tangoti dot com.
There Are No Girls on the Internet was created by
me, Bridget Todd. It's a production of iHeartRadio and Unbossed Creative,
edited by Joey Pat. Jonathan Strickland is our executive producer.

(51:22):
Tari Harrison is our producer and sound engineer. Michael Almado
is our contributing producer. I'm your host, Bridget Todd. If
you want to help us grow, rate and review us
on Apple Podcasts. For more podcasts from iHeartRadio, check out
the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.