
March 2, 2020 60 mins

Who are the people who spread online disinformation? The so-called trolls you hear about in the news whose jobs are to distort facts and create chaos? Camille Francois knows them well. She’s the chief innovation officer at Graphika - a social media analytics firm hired by major companies to identify and fight online disinformation. Her team was a big part of uncovering the extent of Russian influence during the 2016 election. She spends her time in the darkest corners of the Internet taking on one of the most extraordinary digital threats of our time… But it might just be her humanity that gives her an edge.




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
First Contact with Laurie Segall is a production of Dot
Dot Dot Media and iHeartRadio. I talked to
trolls and hackers who made a living doing disinformation and
propaganda for hire, people who said, initially I joined because
I wanted to do campaign messaging, you know, for

(00:22):
my candidate to win. And then I woke up and
I was just sending rape threats to women journalists using
fake accounts, and wondered, what, you know, what happened? What
am I doing here? Who are the people who spread
online disinformation, the so-called trolls that you hear about

(00:44):
in the news, whose jobs are to distort facts and
create chaos? I want to introduce you to someone who
knows them well: Camille Francois. She's a security researcher who
spends a lot of her time in the darkest corners
of the Internet. Camille is the Chief Innovation Officer at Graphika,
which is a social media analytics firm hired by major

(01:06):
companies to identify and fight online disinformation. To give you
some context, her team was a big part of helping
us uncover the extent of Russian influence during the election,
and she knows a lot about trolls. Unlike most of
the people talking about them and the damage they're causing,
she actually spends a lot of her time talking directly

(01:28):
to trolls, to understand the why: who they are and
why they manipulate social media to distort the truth. I'm
excited for you to get to know her. She's unsuspecting
and quick to laugh, which I actually think is pretty powerful.
To give you a sense, she just kind of roamed
right into our production studio, a New York City high

(01:51):
rise with lots of security. No one stopped her. Well,
no one asked me what I was doing there. So,
anyway, you're about to meet her. She's taking on
one of the most extraordinary threats of our time, and
it's her humanity that gives her an edge. I had
no idea when I sat across from her at a

(02:12):
New York dinner party that I would leave asking myself
a question that I want to pose during this episode,
what if the key to fighting disinformation online, and some
of the most alarming cyber threats coming in the future,
starts with empathy? I'm Laurie Segall, and this is First Contact.

(02:38):
So welcome to First Contact. I'm super excited to have
you here. I'm super excited to be here. I have
to say, I love the name of the podcast. I'm
a huge Star Trek fan, so first contact speaks to me. Oh,
I love it. I love that. And you know, um, well,
first of all, I should do kind of a quick
intro to you, which will probably play into like
why you like Star Trek right, and just like your

(02:59):
background is really really cool and interesting. Um, You've spent
your whole career kind of at this intersection of cybersecurity,
public policy, and human rights. UM. You've worked for the
French government, advised governments UM on all sorts of policy issues,
and then you were a principal researcher at Jigsaw
at Google. Jigsaw, this is this

(03:21):
think tank and technology incubator within Google, and now you're
the chief innovation officer at Graphika, right? Is that right? That's right.
So I mean you've essentially spent your whole career digging
into fascinating interesting issues around cybersecurity and human rights. Yeah.
I really lucked out. You know, when I went to school,

(03:41):
I said, I think this is what I want to
do for a living. It's like the intersection of human
rights and tech and politics, and I think back then
people were laughing saying it's not a job. I think
I really lucked out being able to work on these
issues for so long. And so, how do we start
with our first contact? As we talk about our first contact,
our first contact happened at a dinner party, right? Um,

(04:07):
I guess we should say to our listeners, like we
were at a dinner party where no one could talk
about what they did, which is I guess this new
concept of like we're not defined by what we do,
although what you do is just the coolest thing in
the world and it's so badass. Um. But we
sat across from each other and everyone guessed. We were
sitting around a table and everyone was like, guessing what
you did? Do you remember what people were saying? Yeah,

(04:29):
it was really fun. So for the dinner, I had
brought a bottle of natural wine, biodynamic wine, and that's
kind of the only thing that that, you know, people
knew about me, and so they were like, well,
obviously you work in the wine world, and I think
I get that because I'm French. And then some people
were like, no, she works in sustainability, which frankly is
kind of true, given what I actually do. I work

(04:50):
on the sustainability of our conversations online. I don't think
this is where they were going with it, but yeah,
that's that's what they said. I don't know what what
you got. What did people say about you? There's this
weird guy next to me that kept saying everyone was
a dancer. Do you have this story? It was super weird. Um,
I thought you were doing something super specific. I'll come
back to me. I thought you were a diplomat or
something like that, Like I had something super specific for you.

(05:12):
I will totally take that. And I'm forgetting what I said,
but I think I said managed chaos or something. But
I mean, what people didn't really realize is like, sitting
around this small dinner table was like someone who is
at the forefront of fighting disinformation and nation-states and
who works in the dark corners of the web and
sees the craziest stuff ever. And I don't know if

(05:33):
this is like the right thing to say, because like
I hate when dudes do this, but like you do
seem really unsuspecting, right, Like you know, like we were
just joking about how you kind of social engineered your
way into this building, like we didn't come get you
in time, and, like, no one came to ask
you what you were doing there, right? So you're like this,
you come in with this like cute shirt that says "facts,"

(05:53):
and like no one questions you. You just have this,
like, this ability to understand, and I'm gonna
go somewhere with this. Um, you have this extraordinarily human
quality about you. And I think that's what a lot
of people picked up on at the dinner table. And
I think that might be if I could say anything,
what is kind of your competitive edge when it comes
to what you do for your day job? Yeah, you know,

(06:15):
I think, Um, I think people don't realize that empathy
today is a critical component of cybersecurity. A lot of
people thought that in order to do cybersecurity right, you
need to be really good at like managing tubes, breaking tubes,
you know, managing networks. I think today we realized that
if you want to secure systems, secure conversations, secure the

(06:38):
way we live online, you really have to have an
empathic heart and to understand the types of threats and
how they evolve. Like, so, can we get really quick
into because I want to go back to your background,
but just so people know, Like, what is your day
to day, Like, I just envisioned you in an office
like fighting the troll farms online and making sure democracy

(06:59):
isn't ruined. But I don't know what that actually looks like.
So, I know big tech companies call on you to
help with these different campaigns, and um, you've had some
pretty high profile reports that have helped us understand the
extent of Russian influence, But like, what is your day
to day look like? Well, you know, first, I don't
do this by myself. I'm very lucky that I get
to manage a really fantastic team, yeah, of people coming from

(07:23):
different types of backgrounds, and so together we analyze online
conversations and we look for markers that they're being manipulated. Now,
it's really fascinating. You know, we tend to
think when conversations are manipulated it's either trolls or bots
or Russians, but today there's quite a big diversity

(07:43):
of actors and ways in which people manipulate public conversations.
And so we investigate because we want to be able
to have more people do that and to have a
public that's better informed and better equipped to tackle these threats.
We also build tools to make that easier. So we
have an R and D lab. They just do scientific

(08:04):
research on how can we better detect patterns of manipulation
of online conversations, so, like, you can essentially see it coming,
understand if people are being manipulated, understand kind of who's
behind it, and try to stop it. So I'm gonna
put it in this way of how I love to
look at the hacking community and security researchers, and I
got really fascinated by the security community years ago. I

(08:26):
went to Black Hat and Defcon for the first time,
and for our listeners, those are like hacker conferences
in Las Vegas. So if you could just imagine a
bunch of, like, security researchers who are finding vulnerabilities
online, in a weird Vegas hotel room. I mean,
it's just such an interesting community. And I remember my
first exposure to it, I was like, WHOA, these people

(08:48):
are like these modern day superheroes, right, Like they have
the ability to fight the bad guys online and they
have this skill like they can as we talk about
social engineering and you being able to kind of walk
in here and understand how computer systems work. Like,
you could use this skill for bad or for good, right?
And so I remember going to Black Hat and Defcon
for the first time and everyone had code names, right

(09:10):
do you? I mean, right? Like, is that a thing
in your community? It's funny that you say that
because last year I was at Black Hat with Bruce
Schneier and Eva Galperin, and our panel was please use
your skills for good. I mean like back like six
or seven, eight years ago. It was like, even when
I started going to these hacker conferences, it was like
if you found a vulnerability or if you started saying, hey,

(09:33):
this isn't good, you could get into trouble for saying it,
right? And so there was something, um, there was
something very anti-establishment about it, and being that person
who was fighting the bad guys and who was calling
attention to this and who was kind of saying this.
I mean, do you have a code name? No. I,
I mean, yes and no, it's not the

(09:53):
greatest code name. I go by Camtronics. Okay. So,
like, yeah, well, I mean, it's not the
most secret code name. Yeah, that's kind of cool, though.
What does it mean? I'm trying to... yeah, like, what's
behind it, what was behind it? Um, you know, I can't
even remember. It's been with me for a little while.
It followed me around. Um yeah, that's interesting. Um right,

(10:15):
So, I guess what I'm getting at
is there was always this spirit of the community that
had a lot of conviction and wanted to do
something and raise awareness. And so now we're in a
certain moment, and so I guess I'm curious about your background,
like before we get into like all the stuff you're
doing to help fight kind of this this current moment
in time. I'd be curious to know just like how

(10:36):
did you get into all this? And I'll get to
that in a second, but I'm really interested in what you're saying
about how this idea that initially was quite scary, right,
let's enable a bunch of kids to poke at the
systems from the outside and find the holes in it
kind of became our best idea for how to do
security in a complex world. Right. And I think what's

(10:58):
really interesting is that model is going to apply to
more and more questions. So I think a lot about
biases in algorithms, right? So when machine learning makes decisions
that are deeply racist or deeply sexist, right, and all
those problems that we're seeing emerge and that frankly, we
don't have a lot of solutions to tackle. Could the

(11:22):
model of the bias bounty also be applied there? Right?
How many more learnings can we draw from the hacker
mindset and from the security community about how we do security
in an age where we have more complex problems with technologies, right?
I think there's a lot of innovation and promising areas
to explore there. That's interesting. So tell me about

(11:44):
how you got into this. Were you always kind of
a free spirit? I mean who answers no to that question?
So, you know, like, um, no, I don't know
how I really got into this, as a real answer.
I think I was always quite obsessed with technology. I
grew up in France, and very quickly I realized that

(12:05):
the type of questions that I wanted to ask I
needed, you know, to go to the US
to ask and to work on. But, like, taking it back there,
like, were you tinkering? Were you like playing
on the internet? Were you in weird chat rooms? Yeah, I
was a very optimistic person. And to this day, you know,
like my team often says, I'm the most optimistic person

(12:26):
looking at like the darkest stuff. I was really excited
by the promises of the Internet. I really thought that
the Internet was going to bring democracy, was going to
bring more diversity, was going to connect us to
one another. I was I was just really excited about
the promise of the Internet. Perhaps it's a generational thing.

(12:48):
Was it just, like, you loved the spirit
of it? Yeah, I loved the spirit of it. I
loved it. I remember, you know, when my father got
me my first computer, I just thought it was magic.
And so I think the first you know, projects that
I worked on were just so extraordinarily optimistic. So I
remember with a close friend of mine, we had this

(13:10):
project called Citizen WiFi, and we were knocking on doors
around Paris, and we were asking people, could you
remove the password on your WiFi so that everyone can
connect to your WiFi and we'll give you a little
sticker that says Citizen WiFi. And it's not remove the password,
it's like, create, you know, a guest network so
that more people can access the Internet and you're part
of the guest WiFi community. Which, you know, again, it

(13:32):
was a very extraordinarily optimistic idea of just, like, let's put
more people online and things are going to be great. Um.
So yeah, I think I really come from the background
that looks at the Internet with very rosy, optimistic, cheerful eyes,
which I still do. Wait, after all the crazy stuff

(13:52):
I see, I still, you know, I still
have hopes for what, you know, digital technologies can bring
us. Well, I also saw you wanted to be a
space baker, which, I don't even know what,
what on earth does that mean? Because we're going to
get into Russian influence and fighting the bad guys. But
I feel like we have to we have to start
at some point with space baker. You know, like Neelix

(14:13):
in Voyager, or, you know, Quark on Deep Space Nine.
Just have like a little bakery or bar, and
you're on a space station and you make croissants.
You know, it's kind of nice. You meet people from everywhere.
You're like, hey, which quadrant are you from? Do you
want a croissant today? Like, just a nice
little space baker. Um. By the way, no, no, no, no,

(14:36):
it's not that I'm, like, speechless. I'm just like, wow,
what an interesting dream. Like, I wanted to be a
Broadway actress. Unfortunately, you know, I didn't really live
out that vision, but I wish I were cool enough
that that was my dream, to do that. I love that. Um.
And so you ended up going and getting a degree
in human rights and international security. Yeah, that was that
was far from the space baking. Yeah, like the space

(14:57):
baking route was not straightforward. Was there a
certain moment that that dream was crushed?
Was there any indicator that wasn't going to be it? No,
I'm just really bad at cooking. Totally, totally, you're in a
safe space. Yeah, I am just so
bad at it. Well it's good because the things you're
good at we really need right now, which is fighting

(15:19):
really bad guys online. Um. So you ended up getting
a degree in human rights and international security and you
ended up going to DARPA eventually. Can you tell us
a little bit about your work there. Yeah, I mean
that was a long time ago. It was. It's interesting.
It was a project that I was working on at
the end of my grad studies and it was just
looking at privacy and security. So I had a fairly

(15:42):
conservative cybersecurity professor at Columbia who called me a hippie,
and I told him, you know, I'm really not
a very radical hippie. And what I'm telling you about digital rights,
I really don't think it's very radical. I think it's
really something that needs to be heard and discussed. And

(16:03):
I was telling him, like, you know, I'm really concerned
by the growing gap between national security discussions on one
hand and what I considered to be important digital rights
and human rights security discussions on the other. And
I think more should be done to bring digital rights,
to bring human rights, to the core of how
we discuss cybersecurity. What does that mean?

(16:23):
Can you explain that a little bit? You know, privacy?
Privacy is a good example, right, so you had a
lot of conversations on what do we need to secure
the Internet from bad guys? And none of these conversations
back then, right, I think we're in a much
better place now, but few of these conversations considered
that privacy was a very important element of that. This
was, I think, back at a time where most

(16:44):
people considered that there was a tension between privacy and security. Honestly,
I think we've come a long way since then, and
today I think there's a recognition that privacy and security
go hand in hand, right? If you have a system,
you know, that leaks out private information on people, that
also is a system that's easier for hackers to exploit.
And so I was, you know, working on these topics,

(17:06):
and my professor was like, okay, fine, well, you know,
I know a project that could use a hippie, and
you'll just get to work on this project
as a student. So, yeah, that was fun. We're going
to take a quick break to hear from our sponsors.
But when we come back, Camille talks about her work
on the front lines of terrorism online, and how ISIS

(17:28):
was actually really good at micro-targeting. Also, if you
like what you're hearing, make sure you hit subscribe to
First Contact in your podcast app so you don't miss
another episode. And you also worked at Google and specifically

(17:56):
Jigsaw, right, which, for folks who don't know,
is kind of, I think, a really interesting
part of the company, which is where a lot of
this technology-and-humanity work happens, this think tank of sorts, where a
lot of these hard problems, like AI bias, and some
of these efforts to counter extremism. A lot of the
people who are working on these problems are in that realm,

(18:16):
and you are kind of at the forefront of that.
So what did you find there? Yeah, it
was really fun working on these issues. Um, by
the way, I love that you say it's fun. Like,
you work on, like, counter-extremism, and so you're,
like, probably spending lots of time looking at ISIS recruiting videos. Yeah,
but you work on these issues with people who really
care about solving them and who are keenly aware of

(18:38):
the different trade offs, and that I think is a
very fortunate position. Working inside Google on these issues
also meant working in an organization that really wanted
to get to the bottom of it, with colleagues at Jigsaw,
but also, you know, frankly, all across Google, in engineering
and policy, who wanted to make a dent in the problem,
who are willing to experiment with creative, innovative ideas. But

(19:01):
we also had a very clear picture of the trade-offs,
of the constraints, of making sure that we don't go
on the other side of the line, making sure that
we protect freedom of expression as we think through these problems.
So yeah, it's a privileged position to work on these issues.
So what were some of the, I mean, you worked
on a research program to counter online propaganda, So like

(19:22):
what like can you tell me, like what were the
issues that you saw? Like what did you kind of
come up with? Yeah, so it's a program that was
called the Redirect Method, um, and I think what
I was really interested in when we started, um, you know,
thinking through this project, is a lot of people think, okay,
terrorist propaganda, you have to just remove the content. That's

(19:45):
a great first step, right. Indeed, there's content that's harmful,
sometimes illegal; it shouldn't be online, and so you work
to detect it and you work to take it down.
But it doesn't, you know, solve the whole problem, right?
Like, there's still
users who are coming to your platforms and they're saying like, well,
I'm here to consume you know, a piece of content

(20:06):
that you've removed. And this is where the Redirect Method operated,
right Like, when you have users who come and who
are looking for content that is no longer there, do
you still have an opportunity to reach out to them
and to propose something else? Now you don't want to
trick them into something else, Right, So we really do
want to redirect and propose, Hey, here's a playlist of

(20:29):
alternative content that we think might be interesting. And so
you want to find, um, you want to find the
most transparent way to do that, and you want
to find the most sort of clever way to
do that, again avoiding all the potential traps that
are all around this question. I mean,

(20:50):
it was such I remember covering it and thinking, this
is such an interesting idea, Like I didn't realize that
you were behind it. It really was you know, teamwork
from a lot of researchers who, as you said, like,
were really close to the question, right. What we did was
we had to sit with people and really understand, when
you end up in a rabbit hole of consuming terrorist propaganda,
what led you there? Right by the way, I don't

(21:12):
think people understand. Maybe this is me having done a
documentary on, like, someone from ISIS who was killed
in a drone strike. He was kind of a hacker
type, and, um, one of the things, he was actually
in charge of their social media. His name was Junaid Hussain,
a.k.a. TriCk. Maybe you've heard of him. But,
like, I remember he was in charge of their propaganda stuff,
and I remember like thinking like, oh my god, like

(21:34):
he's making, this is gonna sound really weird to say,
we might have to cut it, like, he makes ISIS
look very human. He makes ISIS out to be
like punk rock, like, you know, these rap
videos, and like, these videos that are so compelling
and, so, I'm sure you know, and very
human. So when vulnerable people are going to, you know,

(21:56):
look at these videos like and and this is where
I think the human thing comes in. And it's like
we all sit here and we think, like, these people
are crazy, they're going to join ISIS. So these people
are crazy, they're Russian trolls or whatever it is. But
there's like a lot of humanity behind how people end
up getting into these spaces, right? And it's so much more subtle
than we make it out to be. Yeah. You know,

(22:18):
you know who was really, really good at micro-targeting, though? ISIS.
In what sense? What was really interesting is we tend
to think that terrorist propaganda is just one big thing, where
it's like one bucket of very clear graphic imagery. But
what we were observing actually is tailored narratives, targeted at very
specific communities, very specific people. Right? You don't

(22:40):
convince the young British girl the same way that you
convince, uh, you know, Chinese Muslims online. It's
a much more nuanced and subtle and micro-targeted picture
than we often imagine. And so you left and
joined Graphica. And then your first assignment it seems like

(23:01):
a pretty significant one, right, you were involved in like
a super secret project for the U. S. Senate Select
Committee on Intelligence. Can you just give us the like,
don't give us the company line, like tell me, like
they came to you. What did they say? Like,
what was the mission? You know, at that stage, I
had been working on Russian interference for a little

(23:22):
while already, and so I was already pretty,
you know, pretty obsessed with it. And I remember actually
when my boss called me, the CEO of Graphika
called me and said, like, hey, Cam, what if, like,
we had all the data from, you know, everything that
the Russians have been doing across platforms, and we
could really untangle and understand what's going on? Like, wouldn't

(23:45):
that be great? And I was kind of like, sure thing,
John. Like, yeah, John, that would be great, just
give me a magic data box, that'd be just super great. And
he's like, okay, well, I think we're going to do that.
And I was like, what? And yeah. That was basically
the assignment. The Senate Select Intelligence Committee really wanted to

(24:06):
get to the bottom of what had happened. And I
think we don't often recognize how little we knew then,
and we still have gaps in our understanding of how
reeling this campaign unfolded into thousand and sixteen, but also
before and after, and so it was extraordinarily exciting to
be able to help the Senate who had gathered all

(24:29):
this data and really gave it to us with total
free rein. Right, they said, tell us what you see.
I think my first instinct was, you know, again, as
I said, like, at this stage, I had all this
already sorted in my heart, in my head, and so
I was already looking for bits and pieces,
and it was IRA data. And so I remember

(24:50):
the first thing we did is I was like, oh, well,
here are the things I expected to see in that
data and that are not there. And so it taught
us very toically that the IRA, the Internet research agency
the troll farm that's based in Saint Petersburg was one
part of the problem, but it was not the full campaign,
and so I knew of other campaigns. Later we realized

(25:11):
that were the GRU campaigns, right, which were lacking in
the data set. And so it went like this for
seven months before the report went public, and then again
after. Honestly, it's, you know, it's continuing to
be an endeavor, a puzzle of having to figure out
what really happened: why were the different entities involved in
this campaign, How did the targeting take place, What is

(25:32):
the exact relationship between the hacking and the trolling and
the targeting, How did the platforms respond and um, even
more fun, how did the Russian trolls respond to the
platforms responding? And so we had all of that in
sort of millions and millions of data points. So what
does that mean, like, millions and millions of data points?
Like, how do you do the whole analysis

(25:53):
around it? Or, like... Yeah, we wrote this really long,
you know, report and we tried to talk about the
big trends and everything we observed and the role of
the different platforms and how long this had been going on.
I think the few trends that we really tried to
highlight were: this was not just a campaign against the US.

(26:13):
It was a campaign that had been waged against the
Russian domestic population first, right, and against other populations in Eastern
Europe and also a little bit in Canada and in Germany.
And similarly, I think people were very focused on 2016,
and we were able to demonstrate that it
had been happening before, right? So Project Lakhta, the
big US-focused project of the IRA, actually started in

(26:37):
2014, and in those two years
before the election, there's a lot of fascinating detail of,
you know, the Russians really learning to play
the Americans, right? Like, what are the hot-button issues? Like,
what are the triggers? What can we try? And so
we also looked at those two years of experimentation, in
which there really are bizarre cases. Right? I

(27:00):
think in 2015, around Thanksgiving, the Russians are
trying to freak everyone out, telling people that the turkeys
they would buy at Walmart will have salmonella. Yeah, they did that.
So it's just like a bunch of weird, you know,
bizarre food hoaxes.
A famous case is the hoax called Columbian Chemicals; that's even before

(27:20):
that one, it's 2014. It targets a small
community in Louisiana. This one's interesting because it involves SMS,
and so they're texting people, releasing videos, texting officials, saying
a chemical plant has exploded, and they're trying to create
a panic. It works to some extent, as in, like
you know, it's reported a bit and then the message circulates,

(27:41):
but very quickly the local authorities say, actually, that's
a hoax and it's not true, and they kind of
move on, which, to be honest, you know, I understand.
I think in 2014 in Louisiana, if you
were to have said, it's a hoax and we think
it's a Russian troll farm, I think you would have
sounded insane to anyone around you. You know. But
they did things like this for at least two

(28:02):
years before the election, and of course they continued targeting
the American public after the election. Right? So 2017
is a really interesting year too, because people are
talking about Russian trolls. In 2017 it's a
new reality, and so the Russian trolls themselves are making
jokes about it. Right, So you have fake profiles that

(28:25):
start making messages saying, oh, I'm reading all these stories
about Russian trolls. That is ridiculous. Next time I'll be
accused of being a Russian troll, ha ha ha. Right,
so they kind of, like, adapted to the Russian troll narrative? Absolutely,
and then they start adapting to platforms responding to this activity.
I worked a lot on the part of that activity
that targeted Black American activists in the US, and part

(28:50):
of this effort was to create fake activist organizations and
to work with a real activists on the ground to
do events together, and to really sort of, like,
you know, embed themselves in that community. And there was a
specific group called Black Matters US. And when Facebook determined
that Black Matters US was a fake group and was

(29:10):
a Russian entity, they removed it from the platform. But
they didn't coordinate with the rest of the industry, and
so what really happened is the group went to Twitter
and started complaining about having been kicked out of Facebook,
saying, we're really upset that Facebook supports white supremacists. And
then they started going on Google and they bought a

(29:31):
lot of ads to redirect people to their new websites
because they had to direct the traffic away from Facebook,
where they had been kicked out. And so 2017
is just, like, really sort of, you know, a surreal
year for the Russian trolls, where they're playing cat and mouse
with the industry, which still doesn't fully have, you know,
its mechanisms well set and doesn't really have its policies

(29:53):
well set either, so it's kind of chaos and
confusion for everyone. And then the Russian trolls started talking
about Russian trolling, so it's a bit meta. And then
of course in two thousand eighteen there were the midterms.
In two thousand nineteen they were also showing a different
facet. I think what's interesting
from my perspective is people often think that the Russian
from my perspective is people often think that the Russian

(30:13):
campaign is one year and one thing. I've seen it
evolve over so many years and show so many different facets.
How do you interact with these people, out of curiosity?
Like, do you just sit and watch them from afar?
Do you go, do you have like an undercover name
or something where you're talking to Russian trolls as someone else, Like,
what's your deal? So I've talked to a few people

(30:37):
who have worked in troll factories, Russians and others.
It's funny that you mentioned undercover, and that's not the
type of work I do. But one of the reasons
we know so much about, specifically, the Russian Internet Research
Agency is because a young Russian journalist went undercover and published

(30:57):
everything she could find, and she did that quite early.
I think what's interesting is it's a reminder that,
honestly, the activist communities and the investigative journalist community
knew about this and really went through great pains to
document it before the rest of the world and Silicon
Valley really cared about it. You said something

(31:17):
that I thought was really interesting. You said this work
is two parts technology and one part sociology. What did
you mean by that? A lot of that is really
about understanding socio-technical systems, right? So when you think
about information operations, it's not really like hacking, right? It's
it's not looking for a technical vulnerability, it's looking for

(31:38):
a social vulnerability. It's looking for what's going to play
well into society's divisions, what's going to fall in between
two rules that a platform has, and that's going
to make them not catch me, right? A lot of
this is really playing with social systems as much as
it is playing with technical systems. Speaking of the humanity

(32:00):
of it, you talked about kind of bringing a hacker
mindset to the data security problem, and like, what I
think is so interesting about you: I mean, you would,
like, talk to trolls, right? Like, we have this whole
myth around who these people are who are
doing this. Like, in America we're like, oh, the Russians
are trying to mess with democracy. And, um, maybe
this is just me, selfishly, as a journalist

(32:22):
who loves to talk to the other side and, like,
loves to go to the dark corners where people aren't
looking and hear the other side. You did that, right?
Like, take me into that. So you actually found people
who were working in Russian troll farms and talked
to them. Not just Russians. I was, um,
always very interested, and I think, you know,

(32:43):
brought that mindset to my work. I was really interested
in understanding more from the other perspective, right? So, yeah,
I talked to trolls and hackers who made a
living doing disinformation and propaganda for hire. Take me into
the rabbit hole? How does one decide to do that? Like,
are you sitting at your desk and you go to

(33:04):
these... how do you even get in touch with these people?
Like, and again, it's so different. Yeah, and you know,
as a journalist you get that, right? Like, the stories are
really different. Well, that's, you know, really probably challenging.
So this is why I'm kind of sitting
here being like, props. Like,
give me some specific examples. Like, are there any
people that really stick out to you, that you spoke
to, that just surprised you? I mean, they were

(33:25):
all fascinating stories. Uh, I have to say, I've heard
so many different stories that I would
really struggle to paint them with one brush. Um, things
that come to mind: I've talked to a hacker
who did propaganda for hire all across Latin America. And
that was way before people were worried about Russian troll farms.

(33:48):
You know, it was more... yeah, the entire disinformation-for-hire thing,
trolls and bots and fake profiles in Latin American politics.
That was quite fascinating. In
what sense? Like, why did they do it?
Just to have them win an election, right?
Like, was it patriotism? Was it just money?

(34:09):
Like, it's like, why do people work on political campaigns, right?
There was a schtick. It's like, you know,
you're assembling a political campaign, you're getting a communications specialist.
Do you want this guy who can bring you a
little army of fake profiles and bots he controls? He kind
of made a niche for himself like that. He was
pretty successful until it ended sort of badly, because

(34:32):
he got caught and ended up in jail. And that's
like, yeah, that's like one story. Um, it's interesting
because the campaigning angle came up a few times, right?
So, talking to people who went into doing digital campaigning
really out of patriotism, to support their candidates, right,
and slowly saw the campaign apparatus evolve into like a

(34:55):
state propaganda machine after their candidate came into power. And
so there are a few, you know, a few stories
like this of people who said, initially I joined because
I wanted to do campaign messaging for you know, for
my candidate to win. And then I woke up and
I was just sending rape threats to women journalists
using fake accounts and wondered, what, you know, what happened?

(35:17):
What am I doing there? Someone said that to you? Wow,
what did they say? They said just that, you know,
like, that it slipped, that they went
in for one thing and that, with the success of
the candidate and the evolution of the machinery, they ended
up really doing something else. Can you give
any details? The candidate that this person worked for?

(35:39):
that was a story that happened in India. Um. But,
again, like, I've heard that a few times, and
I think that the story of doing something for political
reasons that ends up sort of like putting you in
the middle of a machinery that's no longer what you
had joined is one that's more common

(35:59):
than what we think. There's also been other researchers who
have done great ethnographic fieldwork talking to trolls. Someone,
you know, specifically, who comes to mind is a
friend who wrote a report called The Architecture of Disinformation
that looks at what happens in the Philippines, and it's
a really fantastic report. And he's talking to people who self

(36:19):
identify as doing this activity. They don't say trolls, right,
they don't say I'm a troll, but they say, yeah,
you know, I make my living by having a lot
of fake profiles. And if you're a candidate and you
want to pay me for this activity, I
will do that. And I think in his
work, what comes through is a

(36:42):
question of, when did that become an illegitimate activity? Right?
Like, there is indeed a real business of people who do this
for hire who suddenly are told, like, you're a
troll and you're going to be deactivated. And I think
part, you know, part of what you hear
when you talk to people on the other side is, okay,
wait a minute, because I've been doing that for a
little while and I thought it was okay, right? Yeah,

(37:05):
I was, um... did you ever find yourself really liking
these people? Yeah, talking to them, you know, you've
got empathy for them. And again, like, such different trajectories, right?
Like, you have empathy for someone who works for a candidate
and suddenly says, like, what am I doing here? Yeah.
Did you ever learn about how they learned how to

(37:26):
pose as American? Like what's the secret sauce? Like, what
is the secret sauce to posing as an American these days, online?
I mean, I'm sure it's changed over the last couple
of years, and it might not be rocket science. But no,
actually, it's fairly complicated. So we know a lot about
how the IRA learned how to pose as an American, right?

(37:48):
And as I said, like, this is where the early
days of the IRA are really fun, because this is
when they have to learn, right? This is why they're
playing around with, like, oh, how much can we freak
people out by talking salmonella in turkeys around Thanksgiving, right?
Like, this is them trying to figure out where
America's hot buttons are. We know that they were watching House of Cards,
which I still think is hilarious. They were watching? Okay,

(38:10):
how do you know that? It's in a
defector's testimony. Um, but really, the legal indictments
have a lot of sort of, like, crazy details on
everything that the IRA did to sort of, like,
learn to be American. Right? So we know they took
field trips. Um, part of how some of the

(38:31):
employees ended up being indicted was they entered the country
on tourist visas, and I think a few years after,
the government was like, I don't think you were here
for tourism. So, a troll farm field trip to America.
A troll farm field trip, you know. Um, you know,
you observe people, you understand how they

(38:53):
act, what they talk about. They were also looking at
their social media metrics very closely, right, so whenever they
were trying out a new post in a group, they
would take notes on what's performing, what's not performing. They
were talking about how to target specific groups and other groups.
And of course I think the thing that we tend
to forget is they were also targeting Americans, right. They

(39:15):
were talking to Americans. They were using their fake personas
to have long dialogues with American activists on all sides
of the spectrum, saying, hey, what do you think about
this? What does the community think about how we're going
to do an event together? And so, honestly, they were
doing serious research. We've got to take another quick break.

(39:42):
But when we come back. It's not just Russia using
sketchy social media tactics. Could American political candidates be using
fake accounts to win your vote? And if you have
questions about the show comments honestly anything, you can text
me on my new community number five zero three four zero.

(40:23):
Did you ever worry just because I know you're kind
of in these dark corners of like you know, dealing
with troll farms, with the GRU, like, I
mean, also, like, real, well-funded governments who are
trying to influence outcomes in some of these very dangerous ways.
Did you ever worry? Maybe this is an extreme question,
but about your safety? Yeah, that comes to mind, Um

(40:47):
It comes to mind, yeah. Of course, I, you know, try
to be as safe as I can. I also
don't worry about it too much, because I also work
with a lot of people who are... I mean, it's
not a race, but, you know, thinking about people
who are at much greater personal risk, it also helps
put priorities in perspective and put some relativity on it. What does

(41:09):
that mean? Um, so, someone I've worked really closely with
over the years on these questions is, for instance,
the amazing Maria Ressa, who is the executive director of
Rappler in the Philippines and who's a fantastic journalist. She's
been arrested so many times, she's been targeted so harshly
by her government that you know, sure, sometimes I worry

(41:31):
about my own safety, but I think, more often than not,
I worry about that with my friends a bit more. So
many of these disinformation campaigns, um, the idea is also
to silence people. And as women... I mean, I guess
a lot of it is also silencing women and
silencing female journalists. And what I... it's, you know...
silencing is definitely a key goal. I'm glad that

(41:51):
you're bringing this question because besides my own safety or
that of my friends, I am really passionate about how
do we build technology to protect users from very well-funded
and well-resourced threats? Right? And when you think
about it, it's a very difficult problem, right? Like, what
can you do when you know that a journalist is

(42:12):
targeted by a nation-state? There's a little-known feature,
I mean, outside of security circles, that's a feature that's
really near and dear to my heart. It's called the
state-sponsored warning. And I've been working a lot on
this and thinking a lot about this. Sometimes, when
a platform knows that as a user you're being targeted,
they would actually give you a little notification that says, hey,

(42:35):
we think you're being targeted by a state actor, why
don't you go and do these ten things, right? Change
your password, enable two-factor authentication, etcetera. And I
think a lot about how much we should, you know,
celebrate these systems. They're not much, but they're
almost the only things that exist sometimes, and how much
we should invest in making sure they're as strong and

(42:55):
robust as possible. Oh, and by the way, here's an ask:
if you've ever received a state-sponsored warning in your
inbox and you have thoughts about, like, your experience and
want to talk about it, shoot me an email. I'd
love to hear those stories. By the way, first of all,
let's just take a step back: like, how scary would
it be if you're just, like, checking your email and
you get, like, a state-sponsored warning? You know,

(43:15):
it's really fun because I've talked to a lot of
people over the years and it's so strange. I mean,
I think it's great. Like, I love that you talk
about taking on propaganda and you're like, this
is fun. And you talk about someone getting a state-
sponsored warning and you're like, this is so fun. It
is horrifying, but... fun is not the right word.
One thing that's really interesting is that people have

(43:37):
such different reactions to it, right? But it's important, right?
I would rather know than not know. It's extraordinarily important,
which is why, like, I have met users who tell
me, like, yeah, we received this, but, you know, we
think it's a drill. We think the platforms really tell
us that just to keep us on our tiptoes.
And I'm like, no, it's not a drill, right? If
you received that warning, please know, this is not a drill.

(43:57):
You really do have to think about your security. You
really do have to enable two-factor authentication, do
those things, right? But you also have users, frankly,
for whom it's been so terrifying and often so
tragic that this becomes, um, a symbol for them, right? So, like,
people have very different reactions to it, and for
some it really is sort of, you know, the

(44:20):
beginning sign of a journey that can be quite
horrifying and frightening and tragic. Yeah. As someone who's spent
a lot of time looking at influence in the two thousand sixteen election,
who just spends so much time on this: what are we missing?
I know, I watched you on C-SPAN, you know,
in research for this. I took some time
and watched you testify, and I saw,

(44:42):
you know, I saw you talking about how
the thing we're missing is, it's not just the
IRA, we're talking about the GRU, and
how these are very well-funded, government-backed campaigns,
and we're not really talking about that. How, also, we're
not even measuring, like, private messages of people being targeted. Like,
you know, I just think there's so much talk about
one thing right now. And my concern as someone who

(45:02):
covers this kind of stuff, is that we just don't
even look at other things. Um, And we scream about
the same things a lot, which is important, but we
don't look at other things. So I thought those two
things were really interesting if you don't mind getting into
them a little bit. And then, like, what's the other
stuff that you think we should be talking about? The
first thing is, indeed, you know, we've talked so much
about the IRA, and that's great. I mean, I say

(45:24):
this as someone who's very deep down this rabbit hole.
I would talk about the IRA day in and day
out, for months and months, without stopping. But
it's not the only actor in foreign influence. And a
lot of people, when they say foreign influence, really, you know,
their mental model is what the IRA did in two
thousand sixteen, which, again, doesn't acknowledge that there were many

(45:46):
other actors. Russian actors, right? So the GRU played
an important part. There are other Russian entities who participate
in foreign interference and information operations. But of course
there's also, you know, other governments, right? So
the first campaign by the Iranian regime targeting US audiences

(46:08):
I think starts in two thousand ten, right? Their first
foreign interference on social media campaign. So a lot of
this was happening also before we kind of, like, woke
up to it. So there are a lot more actors than
just the IRA, and frankly than just Russia, both
on the foreign side and also as as we talked about,
right on the domestic side. That's one thing. And yes,

(46:29):
as you said, I am worried that we still don't
have the full picture of how that specific Russian campaign worked,
and that there's still a lot that's missing from the record.
And working with activists who had been targeted, we looked
at the messages that they received, and we never talked

(46:51):
about those messages, right? When we think about the Russian interference,
we kind of feel like, yes, that's a bunch of
tweets, you had to be a little bit out of
the loop to retweet a Russian troll. This is
not what happened, right? Some people were targeted personally and
worked with fake personas for months and months, organizing
events together and discussing political life, and I think we

(47:13):
don't talk about that nearly enough. I think we're still
lacking important evidence from the record. And you know, I've
worked with activists whose messages have also disappeared, right?
They only have their side of the story. So trying
to piece all of this together is still, I think,
an important endeavor. What do you think is the biggest
threat going into this election? Ourselves. What do you mean? You know,

(47:35):
disinformation is really important. It is true that there's
foreign interference. But, um, it's been very odd to see
the pendulum swing so hard. In two thousand and fifteen,
when I was saying, I think there is such a
thing as patriotic trolling, right, I think governments are actually
doing these information operations on social media. I think there

(47:58):
is such a thing as Russian trolling. It was kind
of like, yeah, really, And now every time there's something,
people see Russians under their beds everywhere, right, Like everything
is disinformation, everything is foreign interference. And I don't think
that's helpful. And what about, I mean, on
the home front? I think, like, you said something really
interesting in one of the testimonies about kind of this

(48:19):
gray area of campaigns, and you said, I think because
of our lack of serious dialogue and what we're willing
to accept on social media or not, we're going to
find an increasing amount of gray-area situations as we
head into the election. Candidates, parties, PR firms... like, are we gonna
see troll farms from actual candidates? Are we allowed? Like,
what's happening behind the scenes? Like, you know, it's

(48:41):
like two different problems, right. The first one is people
don't have a good grounding on what is normal campaigning.
I'll give you a specific example. In the midterms
in two thousand eighteen, there was a candidate who had
his supporters install an app, and the app would use
an OAuth token to access your account and then would help

(49:03):
all the supporters, like, tweet the same campaign message
at the same time. But you know, you would still
have to install it on your phone and you would
have to give the token to the app, right? When
that happened, people completely lost it, being like, oh my god,
look at this, it's the Russian trolls, they're back.
All these messages are doing the same thing at the
same time, they're bots. And it was really straightforward to

(49:23):
see that it was not bots and it was not Russians,
it was just people using a campaign app, right?
Because we actually sort of lack, you know, serious grounding
on what normal people do in the course
of a campaign, we're prone to overreacting. And so there's a
need for a debate on like what is okay for
a campaign to do. Is it okay for people to

(49:44):
download an app and sort of give their account to
their candidate, Is it okay to use fake accounts? Is
it okay to automate some of that activity? And on
the other side, because candidates and campaigns don't really talk
about this, you do have a lot of terrible ideas
that are floating around. I do see people who think

(50:05):
it's a great idea to have a little troll farm
set up for twenty twenty, with a lot of fake accounts
that are just going to, you know, help amplify this
or help drown that out. Like, do you think candidates
now actually have... like, some candidates could actually have troll
farms of their own now? And, knowing what we know,
do you think that there could actually be troll farms
here in the United States for candidates? Yes. Any more details?

(50:32):
I'm worried that this is not a discussion that we're
having with campaigns and parties and candidates. That being said,
I think it's slowly heading in the right direction. I
was very encouraged to see Elizabeth Warren's disinformation plan, that
does say, to my supporters and to my campaign, these
are the things we won't do, right? It doesn't get deeply

(50:55):
into the details, but I think we're going to need
more of that. What did she say that they wouldn't do,
so we can pull up the details of the plan?
But it has a section of it that addresses the
type of behavior on social media that she discourages from
her supporters. I don't think that specifically talks about the
use of fake accounts, which is interesting. I think a
lot of other concepts were kind of like misunderstood. Right,

(51:18):
So bots is a traditionally misunderstood concept that
leads to more complex discussions that people don't really want
to have, right? It's like, what is the role of automation?
What part of your activity can you actually automate? What
part of it is legitimate automation, what part of it
is undesirable automation? Um, yeah. Are you seeing, and maybe

(51:38):
you can or can't get into details, but are you
seeing, like, in the US, candidates or people
associated with candidates having bots or troll farms
or that kind of stuff? So far, I don't think
that we've seen candidates and campaigns sort of like officially
do that. What we have seen is there's a lot
of people thinking it's a good idea to use fake
profiles to do political messaging. How much is that at

(52:01):
the candidate's direction, the campaign's, you know? Sort of... right,
I don't know. I'm hoping that we won't
see more of that. But again, I think a little
bit more of a clear discussion on the rules of
the road in this area would be helpful. Um, I
remember I interviewed Aza Raskin for this podcast, and
he said he thought, in the future, a threat could be, um,

(52:25):
this is maybe very Black Mirror, so just go
with me and then pull me back. But he was saying that,
in the future, a threat could be, you know, a
bad actor using AI, taking a combination of the
faces of the five Facebook people you use the most,
the ones you talk to, you interact with, the most,
and targeting you with a face that you
automatically trust. Almost like you just kind of trust
this face because it's a face that you're almost more
(52:48):
his face because it's a face that you're almost more
used to. Your brain just kind of registers it. Do
you think we could see something like that? Yeah. I
think the technology is already on the table for that,
which is interesting. Um, we recently did a report. We
called it FFS; I think Fake Face
Swarm was the official name. And it was a really

(53:09):
interesting report because it looked at a very large campaign
of fake profiles that used generative adversarial networks, which is, you know,
basically, like, AI, to create fake faces from scratch. And
so all these profiles had these fake faces that were
generated by AI, and we realized, wow, this is something

(53:29):
that we really, honestly, thought was a little bit further
down in our future, that we were just seeing there. But
on the other hand, the technology was there, it's very
easy to do, it's available to anyone. Honestly,
on this one, I think it was harder to detect
than to make. There are still sort of telltale signs, right?
So something that was interesting is when you create, or

(53:50):
at least with that generation of generative adversarial networks, the
symmetry of the faces was often wrong, right? So if
you would have an earring on your left ear, the
matching earring on the right would actually, like, not match
at all, right? Or if you had a face
wearing glasses, the left branch would kind of be off
if you compared the left branch to the
right one. So, like, there were telltale

(54:12):
signs like this. But still, I think, with a
lot of the AI technologies that generate these types of outputs,
it is still the case that it's easier to
generate them than to detect them. People argue that privacy
is kind of a blurry concept. They say, I have
nothing to hide. What do you say? Ah, there's an
entire book to be written about that. Um, yeah, that's

(54:35):
not the point of privacy. What is the point? The
point of privacy is the preservation of society and intellectual independence, right?
You don't have to have something to hide. You deserve
your privacy. And, you know, it's a fundamental value in democracy.
Kind of this next threat you talk about a little
bit is not just deep fakes. Can you just take

(54:56):
us through the idea of read fakes? Yeah.
And so, you know, people think about deep fakes a lot, right?
So the ability for machine learning to generate a video
from scratch of an event that never happened with a
limited training data set. Um, I think that's important
and interesting. I also worry a lot about how that

(55:19):
plays out in the text space, right? So there are
a series of language models, and GPT-2 is one of them,
and tools today that enable you to take a short
training sample and generate a lot of believable text
based on that. And I worry a lot about what
that does to the disinformation ecosystem, right? Because when you

(55:42):
spend a bit of time studying troll farms and disinformation operations,
they often have to produce a large amount of engaging
and believable text, right, to sort of, like, put out
on a varied set of properties or online accounts
or domains and, you know, fake profiles. And so

(56:02):
I do worry a lot about that specific threat,
which I, you know, jokingly call read fakes. Um,
how would it play out? Like, how do you see
it playing out? Well? If you run, for instance, a
disinformation ecosystem where you have two hundred sides that you're
pretending have nothing to do with one another, it becomes
a cheap, cheaper, and easier for you to keep two

(56:25):
hundred sides sort of hydrated with fresh content. UM. I
have a wonderful partner who, um, is a bit cheeky,
and he teaches kids to use, sort of, like, deep
fakes and read fakes and all that. And I
think that's actually sort of a good response. I think
people should play with those tools and sort of

(56:47):
understand what they can do, what they cannot do, and
have sort of a lot more familiarity with these techniques
so that they can more easily spot them. Um. Last
question: you said you're an optimist, or at least in
your Twitter bio it says you're an optimist mind focused on
dark patterns. Why, despite everything you've seen, are you so
optimistic? Because people are great. What makes you... I mean,

(57:11):
I guess I don't know if I have a follow-up
to that. Um, why do you still think people
are so great, despite everything you've seen? Because I think
that a lot of what needed to be uncovered was
hard to uncover. I think people worked really hard to
demonstrate that this phenomenon existed. I think people worked hard
to say, look, there are such things as troll form.

(57:31):
This is how they work. I think people worked hard
to say, yes, you know, activists are targeted, and this
is what's happening. And I think despite the problems growing
in complexity and in size, there's always been fantastic people
chasing them and exposing them, and you know, coming up
with creative solutions. You work so closely with all the

(57:52):
tech companies. So do you think they're well equipped to
take on this next challenge? We're much better off than
a few years ago, for sure. We've come
from far. You're still optimistic? I'm still optimistic. I mean,
we are in a much better position, sort of, like,
the tech industry in general, than we were a
few years ago. It's still not perfect, still a lot
to do, both from, like, creating better rules, being better

(58:14):
at implementing them, and creating technology to be able to
do detection faster. But maybe bringing more humans like you
to the table? I would just say, like, adding in
the people who actually have an understanding of humanity, because
I think the thing that seems to keep going missing
in the narrative is the human part. And maybe had
we been paying attention a little bit more to the
psychology of hacking and people and that kind of thing

(58:37):
we you know, and there were maybe more people in
these tech companies at the time, maybe that would have
been something we could have caught a little bit earlier.
More social scientists in tech, more diverse backgrounds in tech.
You really can't go wrong with that recipe, for sure.

(58:57):
So, what do you think? We hit on them: the trolls,
the dark corners of the Internet, and a little bit
of optimism. I would love to hear from you. Are
you liking the episodes? What do you want to hear
more of? I'm trying out this new community number, five
zero three four one zero. Text me, it goes directly

(59:18):
to my phone. I promise I'm not just saying that.
And here's a personal request. If you like the show,
I want to hear from you. Leave us a review
on the Apple podcast app or wherever you listen, and
don't forget to subscribe so you don't miss an episode.
Follow me. I'm at Lorie Siegel on Twitter and Instagram
and the show is at First Contact podcast on Instagram.
On Twitter, we're at First Contact Pod. First Contact is

(59:41):
a production of Dot dot Dot Media. Executive produced by
Lorie Siegel and Derek Dodge. This episode was produced and
edited by Sabine Jansen and Jack Regan. Original theme music
by Zander Sing. First Contact with Lorie Siegel is a
production of Dot dot Dot Media and I Heart Radio