
November 2, 2022 29 mins

In Part Five of LikeWar, the full extent of Russia’s online disinformation campaigns comes to light. It didn’t end after the 2016 presidential election. No, the effort to sow chaos is ongoing - in the US, in elections abroad, and in the global fight against the coronavirus pandemic. 

 

This series is adapted from the book LikeWar, written by series narrator Peter Singer and series contributor Emerson Brooking. To learn more about their research and defense work, you can find them on Twitter @peterwsinger and @etbrooking.

 

Get the book at LikeWarBook.com.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:09):
Welcome to the Republican presidential debate here on the Fox Business Network. CNN's presidential debate starts now. In the center of the stage tonight, businessman Donald Trump. Right from the jump tonight, Donald Trump made waves. It's November 2015. The

(00:31):
competition to be the Republican nominee for president has just begun to pick up steam, and a new political Twitter account has emerged on the scene. Posting as @TEN_GOP, it announces itself as Tennessee GOP, the hub for the state's Republican Party. It looked like a totally

(00:54):
legitimate account. The picture on the Twitter page featured the Tennessee State Seal. The account had all the right values: quote, I love God, I love my country, it announced on its profile. It soon posted literally thousands of tweets: pro-Christian, anti-Islam, pro-family values, anti-gay rights, and it

(01:15):
weighed in on the candidates with tweets applauding Donald Trump.
Mixed in were various lies, but ones that seemed believable
to a readership conditioned to think the worst of the
other side, for example, claiming quote Obama wants our children
to be converted to Islam. According to a new Public
Policy poll, a majority of Republicans still believe that President

(01:37):
Obama is a Muslim. The account also mocked the idea
that Russia had anything to do with election hacking. There's
no doubt now that Russia has used cyber attacks against
all kinds of organizations in our country. And I am
deeply concerned about this. How concerned am I? On a scale of one to a hundred? Unconcerned. And it

(01:58):
ridiculed anyone who suggested that the claim might be true.
That was part of the account's appeal. Donald Trump had
been lauded on the campaign trail for not mincing words
about his competitors. Jeb doesn't really believe I'm unhinged. He said that very simply because he has failed in this campaign. It's been a total disaster. Nobody cares. I'm relaxed. Go ahead.

(02:23):
We want to do is to replenish the sun. This
Twitter account was just following suit, tweeting out a mix
of pro-Trump cheerleading, far-right content, and conspiracy theories. @TEN_GOP gained over a hundred thousand followers, and each follower was pushing it out to their hundreds, or even

(02:46):
hundreds of thousands or millions, of followers. By November 2016, just one year after its creation, @TEN_GOP was garnering support from the highest-profile supporters of the presidential nominee, including General Michael Flynn, Kellyanne Conway, and even Donald Trump Jr. This meant that on Election Day it was

(03:08):
in the top ten of most widely read accounts, more
than most politicians, media outlets, even celebrities. That day, it
was routinely retweeted not just by Republican activists and politicians,
but by most members of the incoming Trump administration. It

(03:31):
was extremely influential in far-right political circles, and throughout all that time, no one stopped to ask who exactly in Tennessee was running this account. That is, no one besides the real GOP in Tennessee, which wasn't behind

(03:53):
the account. It took them three separate complaints before Twitter finally took action, permanently suspending the @TEN_GOP account in August 2017. For the executive director of the state's Republican Party, the account had just been a nuisance, a case of
brand impersonation that confused voters trying to find the official,

(04:16):
verified Tennessee GOP page. We had no idea, never were in contact with whoever ran the Twitter account. You know, for the most part, we assumed it was somebody in their basement running this Twitter account. But the person behind @TEN_GOP wasn't just some guy living in his mom's basement.

(04:37):
It wasn't even someone in America. The Tennessee account had
been a sock puppet, a fake account run by a
real person working for a Russian government troll farm. And
it was just one of thousands of fake accounts removed
by Twitter for participating in a strategic disinformation campaign targeting

(04:58):
the US presidential election. Twitter is cracking down on accounts
deemed to be fake. Facebook on Wednesday said it had
suspended a network of fake accounts used by Russian military intelligence.
The company reportedly deleted some seventy million users from its site. That's equal to roughly 20 percent of the site's monthly users. I'm

(05:24):
Peter Singer, and this is LikeWar, Part Five: Fake News, Incorporated. The interference that targeted the US presidential election in 2016 came from all over the place. It came from
tiny Eastern European countries where teenagers were trying to make

(05:47):
an extra buck. It came from within the United States
in the darkest corners of the Internet, and perhaps most notably, it came from the IRA. That's not the Irish Republican Army or an individual retirement account. No, we're
talking about the Internet Research Agency. The Internet Research Agency
was a troll farm, a sock puppet army of fake

(06:08):
online accounts and automated bots spreading synchronized talking points. The
agency was based out of a small industrial building in St. Petersburg, Russia.
My co-author Emerson Brooking can explain who was the
mastermind behind the troll farms. This was not part of
the Russian government. This was essentially a marketing firm operated

(06:33):
by one of Russian President Vladimir Putin's cronies. When we
talked about Russian interference in the election, most of the
Russians doing that interfering, running those Facebook and Twitter sock puppets,
they weren't spies or agents of the Kremlin. Instead, they
were basically viral marketers. The IRA didn't conduct

(06:56):
actual research per se, that is, unless you counted its information warfare efforts on how to use the Internet
to target Russia's enemies, set up by one of Putin's
cronies who would later be indicted by the FBI. The
organization started hiring young, English-literate Russians. Many of them
had majored in the humanities and couldn't get jobs elsewhere.

(07:17):
They'd grown up like millennials around the world, steeped in
global internet culture, which is very much American culture. That
meant when they were asked to pretend to be American
voters to play to American political issues, they were quite
good at it. And indeed, for many of the young

(07:41):
Russians doing this work, they didn't bear any special animus
against the United States. Instead, they saw it as sort
of a creative challenge. The creative challenge to set up
and run thousands of sock puppets and tens of thousands
of bot accounts online. They had one main purpose: to export Russia's influence, not by making people love Russia, but

(08:04):
by generating confusion and anger among Putin's foes. They did
so by spreading falsehoods and sowing division among the Americans who they were masquerading as. To operate a sock puppet means that you are inhabiting a particular identity, that you consciously
take steps to make this identity seem and feel like

(08:30):
a real person. You might invent a fake history, background, interests, and the intention is, when you write or share content
using this persona, you've convinced everyone else that this is
a real person who's doing it. Instead of trying to
get a real sympathetic person to write your story for

(08:52):
you, you can just create a fake persona. They would plant the seed of a story. That story would be repeated; other media properties would pick it up, so
there would be this chain of citation. Being a paid
troll for the Russian Internet Research Agency wasn't easy. There
were some leaked documents that revealed some of the strict

(09:15):
quotas that these employees were under. They were expected to
work an average of twelve hours a day, to post on
news articles fifty times, to maintain six different Facebook personas,
and publish at least three posts a day. They were

(09:38):
discussing ongoing events, and they also had metrics they had
to meet. By the end of the first month that
they were employed, they were expected to have five hundred subscribers.
In many ways, it was like Black Mirror meets The Office, spending hour after hour in a cubicle impersonating

(10:00):
someone from afar. On Twitter, they were expected to get
as many as two thousand followers and tweet at least
fifty times a day. Vitaly Bespalov worked inside Russia's Internet Research Agency for three months in late 2014. He'd write up to twenty fake articles a day spinning

(10:20):
the war in Ukraine. That article would then be posted
by a blogger and then be spread by someone in
the social media department. That was one way, he says,
or bloggers would write fake posts and he'd quote them
in the articles. They bragged about sometimes occupying fifty different

(10:42):
identities at the same time, of how they might be
a retired military veteran one moment and a young African American activist the next, how they're weaving these fictions and stories and doing so to advance, ultimately, the objectives of the Russian government. So you believe that this operation was backed by the Kremlin? Absolutely,

(11:07):
he says. He also believes it's still up and running. These troll farms can produce such a volume of content, with hashtags and topics, that it distorts what is a normal, organic conversation. As it turned out, all that effort by the IRA had paid off. The sock puppet personas may have been fake, but their influence was very real. The

(11:31):
debate continues today as to what forces were most important
in swaying the election to a result that surprised even
Donald Trump himself. But what was undebatable is that the
Russian online influence effort, especially combined with its hacking of
the DNC's email system, had altered what both
American media and American voters were talking and thinking about.

(11:57):
Our own research found that it surpassed in Facebook discussion not just political topics, but even topics like the Cubs finally winning the World Series. It gave a foreign power influence over the course of the election in a way that had never happened to a democracy before.

(12:20):
The Russian intent had been to paralyze the country, perhaps
to fatally wound an incoming Clinton administration and to bog them down with domestic troubles, so America could focus less on restoring the NATO alliance and less on countering the Russian

(12:41):
invasion of Ukraine and Russian intervention in Syria. Instead of
crippling a Clinton administration and handicapping the United States, the
Russian government succeeded in defeating Clinton and miraculously in electing
someone who would be much more their ally. So the

(13:02):
Russian government's efforts truly did succeed beyond their wildest dreams.
Days after the election, as the full extent of social
media's power was becoming apparent, Facebook was also taken aback.
The platform, originally created by a college student to help
rate whether his fellow dorm mates were hot or not,

(13:24):
had just become the center of politics, and not in
a good way. Mark Zuckerberg gave a quote that he
quickly came to regret. You know, personally, I think the idea that, you know, fake news on Facebook, of

(13:45):
which, you know, it's a very small amount of the content, influenced the election in any way, I think, is a pretty crazy idea. Now he was immediately criticized for this. He was criticized in large part because you could go to Facebook advertising brochures and see where Facebook, the company, loudly bragged about being

(14:09):
able to influence elections, and that was why politicians should advertise with Facebook, essentially carving out special rules for politicians. Here's how Facebook's VP of Global Affairs and Communications puts it.
We have a responsibility to protect the platform from outside
interference and to make sure that when people pay us
for political ads, we make it as transparent as possible.

(14:31):
But it is not our role to intervene when politicians speak.
And that's why I want to be really clear with
you today. We do not submit speech by politicians to
our independent fact-checkers, and we generally allow it
on the platform, even when it would otherwise breach our

(14:53):
normal content rules. What Zuckerberg and other tech CEOs had to grapple with here was how online attention and virality and its influence on our real-world behavior was the
very core of their business models. They hadn't set out
to alter politics. The networks that they had created were
for-profit businesses, designed first and foremost to make money.

(15:18):
And the more users, the more likes, the more shares
and follows meant more dollars, not just in revenue, but
in all-important share prices. If you were to go back ten years, the earliest sock puppets were generally for
commercial interests. It was fake comments you'd find on YouTube

(15:40):
or Instagram or Facebook. It was fake followers. It was
things that you bought with money, generally for something commercial
just to boost your page. You've probably seen the sort
of thing I'm talking about where someone will be offering
five thousand extra Facebook page likes for ten dollars. So
today we're gonna see what happens if you try to

(16:00):
buy followers on Instagram. Because if you want people to
pay more attention to your tweets, then you either need
to work very hard or buy your followers instead. The
source and objective of the online influence efforts had changed,
but the incentives for social media companies remained the same. And beyond the business model, the Zuckerbergs of the world

(16:21):
had a philosophical belief which also had a legal rationale
behind it. They believed their websites to be platforms, not publishers.
They were responsible for giving people a way to voice
their opinions, not responsible for what those people, real or fake, voiced,
nor any of the consequences. Congressman, we do not want

(16:42):
to become the arbiters of truth. But in the year
following the election, all that started to shift. Facing criticism
from both Congress and users of their own networks and
even their own employees. Do you see a potential problem here with a complete lack of fact-checking on political advertisements? Well, Congresswoman,

(17:03):
I think lying is bad, and I think if you were to run an ad that had a lie, that would be bad. So you won't take down lies, or you will take down lies? I think it's just a pretty simple yes or no. In a democracy, I believe that people should be able to see for themselves what politicians that they may or may not vote for are saying and judge their character for themselves. So you won't take them down? Zuckerberg and other tech CEOs began to acknowledge the immense political power and

(17:26):
influence they wield, but it was a slow burn, a dial-up pace for a digital world. The Russian operation that targeted the US election was finally exposed and taken down by Facebook about a year after the election. A year after Trump had been elected, in 2017, Facebook released its

(17:51):
first of many reports squarely putting the blame for a
foreign intelligence operation on the government of Russia. In 2018, Facebook launched its first quote election war room and had specialized
teams basically patrolling the network looking for foreign activities targeting

(18:16):
the US midterm elections. And then Facebook took that model overseas, first in Brazil. Facebook has blocked hundreds of right-wing activist accounts in Brazil as part of their campaign to tackle fake news. Then Ukraine, and then Taiwan. Facebook
has rolled out stricter controls on political ads in Taiwan

(18:38):
and now those who want to advertise about social issues,
elections and politics are required to provide proof of identity.
And all of this was a lead-up to what Facebook employees described as game day, and that was the 2020 election. By the time of the 2020 election, it was a very

(19:00):
different social media environment. There was an awareness of the
power of the platforms and a hyper-vigilance of the
potential threats that had been absent in the previous presidential election,
and scarred by years of critique, the companies took actions
they claim weren't possible in the past. So one of
the things that we're going to be doing is at

(19:20):
the top of Facebook and Instagram, we're going to
be putting accurate information about how to vote by mail
and how to do it accurately. Facebook has created a
Voting Information Center to dispel political misinformation around voting and will help register new voters. Facebook apparently is going to limit or even ban political ads on the platform.

(19:43):
A week before the election, Facebook, Twitter, and other companies
identified various foreign interference campaigns and took down multiple false
front networks, and they partnered with US government cybersecurity teams, local government officials, and civil society organizations to protect
information about where and when you could vote. I was

(20:05):
part of one organization that had over a hundred researchers
looking for any trace of foreign interference activity or any
obvious disinformation campaigns that were aimed at delegitimizing election processes. Now,
these companies were still far from perfect. There's plenty of
stuff that got through the cracks, plenty of content that

(20:25):
was allowed that should not have been, but there were
no large interference networks that were able to target the election. Where the American people had been kept completely in the dark for months after the 2016 election, this time they had been updated every step of the way, and almost

(20:48):
none of the tactics that had worked in 2016 worked in 2020.
The operation that targeted the US election in 2016 was finally exposed and taken down the September after the election. This time around, we've seen an operation being taken down across multiple platforms, in cooperation with law enforcement, the September before

(21:09):
the election. And that's a really, really important difference. Catching
it before it can actually reach the day that it's
targeting is much more effective than catching it a year
down the line. It was a huge success by one measure,
but there's an old military adage about not fighting the
same war twice, and that was true of the Russians
and the other threat actors who were targeting the election.

(21:32):
They weren't acting like it was 2016 anymore either. As detection
methods and content moderation policies evolved, so did their disinformation strategies.
For example, in 2016, most of the fake Russian accounts were fairly simple to ID for either the tech companies or any expert putting any effort into it. The often

(21:53):
poor English and stolen profile pictures grabbed from the open
Internet often gave the Russians away. If you look at the account, you see a lot of pro-Kremlin content, you see a lot of pro-Trump content, spelling mistakes, the
English is not the first language. It's more likely to
be a female account, not a huge amount of followers,
and it's a lot of tweeting at people, certain articles,

(22:16):
alternative sources, that sort of thing. And if you go through the account and it consists of nothing but these recurrent themes, it's more than likely that it's a Russian troll. There is no scientific method for it, because you do get these people anyway. You can stay in this game long enough and you begin to recognize trolls.
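The checklist above amounts to an informal scoring heuristic. As a purely illustrative sketch, and not the analyst's actual method (he stresses there is no scientific test), the short Python snippet below shows how those rules of thumb could be tallied into a rough red-flag score. The field names, thresholds, and example numbers are hypothetical assumptions made for this illustration.

# Purely illustrative: tally the rules of thumb described above into a rough
# "troll-likeness" score. All field names and thresholds are hypothetical.

def troll_likeness(account):
    """Return the fraction of heuristic red flags this account trips (0.0 to 1.0)."""
    red_flags = [
        # Feed is almost entirely pro-Kremlin / pro-Trump material.
        account.get("partisan_post_ratio", 0.0) > 0.8,
        # Frequent spelling mistakes, suggesting English is not the first language.
        account.get("spelling_errors_per_post", 0.0) > 1.0,
        # Not a huge amount of followers.
        account.get("followers", 0) < 500,
        # Mostly tweeting at people and pushing links to "alternative" sources.
        account.get("reply_and_link_ratio", 0.0) > 0.7,
        # Nothing but the same recurrent themes, no everyday personal content.
        account.get("topic_diversity", 1.0) < 0.2,
    ]
    return sum(red_flags) / len(red_flags)

if __name__ == "__main__":
    suspect = {
        "partisan_post_ratio": 0.95,
        "spelling_errors_per_post": 2.3,
        "followers": 180,
        "reply_and_link_ratio": 0.85,
        "topic_diversity": 0.1,
    }
    print(f"troll-likeness score: {troll_likeness(suspect):.2f}")  # prints 1.00

A score near 1.0 only means an account matches most of the listed red flags; as the analyst notes, real people can trip them too, which is why this is a rule of thumb rather than proof.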
In 2020, the Russian trolls who had once run sites like @TEN_GOP weren't the center of the effort. Instead of writing fake

(22:40):
articles themselves, they hired actual American freelancers to produce their content.
These freelancers had no idea that they were working for
Russian trolls. In fact, a lot of these freelancers talked
about them being some of the nicest editors they've ever
worked with, some of the most generous. It is tough

(23:00):
to be a freelancer. Pay is not good, but the
Russian trolls were quite generous with their money. But even then,
even with the AI-generated profile pictures, the use of
real Americans for the writing and promotion of this work,
the actual number of people who read anything from the

(23:20):
Russian trolls was absolutely minuscule. It could be measured
in the thousands, and the result was that Russian sock
puppets and fake accounts didn't have the same impact on
the election that they did in 2016. Social media companies made big changes to their platforms to make sure of that,
and that's a huge step forward. But the fight against

(23:41):
disinformation is far from over. Because now Russia doesn't have
to try so hard to inject falsehood from afar. There's
enough of it being created within the United States to
have a powerful and awful effect. The Russians didn't need
to create fake accounts to push conspiracy theories on everything from lizard people and QAnon. It's this

(24:03):
idea that somehow there's a group of powerful figures who
are in fact not human, they're reptiles or they're lizards, perhaps extraterrestrials. Q is a patriot, but we do not
know who Q is. People believe that Q is someone
very close to President Trump. To space satellites stealing the election,

(24:23):
there was an entire ecosystem of domestic voices to do it,
and they were bolstered by a broader media ecosystem that,
just like those Macedonian teens, saw profit and power in
the lies. They would get support from far-right politicians.
They no longer had to tell the truth. They could

(24:43):
invent their own conspiracies out of thin air, and millions
of people would believe it, and President Trump would promote it. During the pandemic, the QAnon movement has been, appears to be, getting a lot of followers. Can
you talk about what you think about that and what
you have to say to people who are following this
movement right now? Well, I don't know much about the

(25:05):
movement other than I understand they like me very much, which I appreciate, but I don't know much about the movement.
I've heard these are people that love our country. In
this sort of environment, there is no need for a
coordinated operation from the Russian government. And the scary thing

(25:25):
is it worked. Some of the most bizarre conspiracy theories
that once existed only in the darkest corners of the
Internet became mainstream, making their way into conversation in the
halls of Congress and the White House, culminating in what
became known as the Big Lie, an overlapping concoction of
some fourteen different conspiracy theories, each and every one of

(25:49):
them proven false by the courts, including by judges appointed
by Donald Trump. But the Big Lie took off online
and then in the real world, all the way up
to being voiced by the onetime leader of the Free World. Pure theft in American history. Everybody knows it. That election.
Our election was over at ten o'clock in the evening.

(26:11):
We're leading Pennsylvania, Michigan, Georgia by hundreds of thousands of votes.
And then late in the evening or early in the morning, boom,
these explosions of bullshit. What had once been unthinkable and
unspeakable was now being voiced by the loudest voices of all.

(26:34):
In just a decade, the constant cycle of lies and
disinformation had mainstreamed itself into the American political system. Yet
the poison wasn't just hitting American democracy. It was also
harming American public health. For in 2020, a pandemic had struck
the world, and here too, Russia had quietly pivoted, weaving

(26:57):
itself into what public health professionals called the quote infodemic of falsehoods that surrounded the coronavirus pandemic and how
to prevent it. I still come back to the idea
of a booster shot. I mean, yes, you're right, that
seems, you know what, that sounds to me like a money-making operation. You know, when these first came out,

(27:18):
they were good for life, then they were good for
a year or two, and I could see the writing
on the wall, I could see the dollar signs in the eyes of that guy that runs Pfizer. It was
just one more illustration of the ongoing danger of these
online disinformation campaigns. Whether they come from Russia or from
within the United States, their influence on real-world actions

(27:41):
could no longer be ignored. And that's what's next on
LikeWar: the very real consequences of ignoring online threats, and a look at how social media facilitated the January 6th insurrection. This is a production of iHeart Podcasts, Graphic

(28:06):
Audio, and Goat Rodeo. Karas Shillen, that's me, is the series lead producer. This episode is just one of a seven-part series. Find other episodes wherever you get your podcasts.
If you'd like to dive deeper into the work of P. W.
Singer and Emerson Brooking, you can access the full audio

(28:27):
book LikeWar, on which this series is based, wherever you get your audiobooks. Writing and editing from Karas Shillen. Production assistance from Isabelle Kirby McGowan. Senior producers are Ian Enright and Megan Nadowski. Please share this series with

(28:48):
the hashtag #LikeWar to find other conversations about the series.
Thank you for listening.