
July 27, 2025 31 mins

Russia, China, MAGA — how social media is weaponized to promote misinformation and conspiracy theories, leading to nefarious calls-to-action.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
I'm John Sipher and I'm Jerry O'Shea. I was a
CIA officer stationed around the world in high-threat posts
in Europe, Russia, and in Asia.

Speaker 2 (00:09):
And I served in Africa, Asia, Europe, the Middle East
and in war zones. We sometimes created conspiracies to deceive
our adversaries.

Speaker 1 (00:18):
Now we're going to use our expertise to deconstruct conspiracy
theories large and small.

Speaker 3 (00:23):
Could they be true or are we being manipulated?

Speaker 1 (00:26):
This is Mission Implausible.

Speaker 3 (00:31):
Let's return to our conversation with Clemson University's Darren Linvill,
the expert on the intersection of social media and conspiracy theories.

Speaker 1 (00:42):
Much of what you look at in this story about
social media and how it's being weaponized and such is
a negative story. Is there any reason for optimism? Are
we doing better than we did a few years ago?

Speaker 4 (00:52):
That's a difficult question to answer, because I think in
most ways we're probably doing worse, because we backed off
from trying to moderate social media to any degree. A
lot of that we've seen after Musk's purchase of X:
he backed off from moderation, and the other platforms did
as well, because there was just no political impetus to

(01:13):
do so anymore. We've seen a lot of social media
fracture into smaller platforms, a lot of private platforms, like
I said earlier, like Telegram, and that increases the potential dangers.
But if there is a potential hope, I think it's
ironically tied to one of the bigger dangers, which is AI,

(01:35):
because while AI can potentially do a lot of harm,
I think it can still be used as a defensive
tool as well. It can be used to identify malign influence,
it can be used to respond to malign influence. There's
a recent study that came out that looked at the
use of artificial intelligence to engage with conspiracy theorists and

(01:58):
talk them off the ledge. It was actually found to
be very effective because it can respond so fast. So
if you're talking to a conspiracy theorist, they
are steeped in their conspiracy theory, they know everything before
you can respond, and they know how to respond to
your response. But AI is even faster, and it can

(02:18):
pull that, you know, very specific counterargument deep out
of the ether and give it to the conspiracy theorist,
and it was found to be very effective at reducing
belief in

Speaker 5 (02:29):
various conspiracy theories.

Speaker 4 (02:31):
So I think that if there is a hope, it
is the thing that many people fear most, and that
is AI.

Speaker 2 (02:36):
So Zuckerberg and Musk are going to save us.

Speaker 6 (02:38):
In the end.

Speaker 5 (02:39):
Oh, I definitely didn't say that, Sam Altman.

Speaker 4 (02:42):
I guess the real question is whether AI destroys us
before it saves us. Burn the village to save it.

Speaker 2 (02:48):
Yeah. So I was wondering if you could comment on
this. It's blatantly political, but I still have to ask.
People in the administration are openly parroting Russian disinformation,
I mean Russian talking points coming right out of RT,
senior people in this administration. Are they being influenced by
what they read? They've got huge bureaucracies and they

(03:10):
have AI, they have access to truth. And it goes
from, on the one hand, the ludicrous, like is there gold
in Fort Knox? It's like, well, you friggin' run it, right,
you know, just go count.

Speaker 6 (03:21):
Yeah, I mean, you know.

Speaker 2 (03:22):
To other things like congressmen saying that the US is
sponsoring bio labs in Ukraine and that gives the Russians
the right to attack. Complete, utter bunkum.

Speaker 4 (03:34):
I think that the most likely answer is just that
Russia has become very effective at seamlessly integrating their talking
points into the conversation, influencing the broader conversation, and because
all of these political leaders are in that conversation, are
steeped in that conversation. I mean, we see how much
time people like Musk and Trump spend on social media.

(03:58):
They pick it up organically. When we were talking about
USAID, what was that, two months ago now, being shut down,
Musk actually retweeted a Russian-made video from the Matryoshka
campaign and helped spread that video. It was
a video that was made by the Russians, disseminated by
the Storm-1516 campaign, and Musk spread that video.

(04:21):
And it's not because he was paid to do so,
it's because he picked it up organically from that broader conversation.
We've seen that with a number of different narratives that
the Russians have made. You talked about the yacht story
from Storm-1516 earlier. J.D. Vance repeated that
narrative before he was vice president. So I think they've
really just become extremely, the Russians, I mean, have become

(04:42):
very good at their job. This is very different than
what they did back in twenty sixteen, when they were
operating these sort of artisanally created social media accounts and
trying to become part of the conversation. Now they're just
paying existing influencers. And the benefit of that is, A,
they're not running anything themselves, so nothing's going to get suspended.

(05:03):
But the much bigger benefit is they're not trying to
be part of the conversation. They're just leading the conversation.
They're paying the people who are leading the conversation to
say what they want them to say.

Speaker 1 (05:15):
Or even just amplifying malign actors, exactly. Obviously what we're
seeing now with these protests on ICE raids in
Los Angeles, and probably expanding to other places, it's a
place that's ripe for disinformation or misinformation to be spread.
If you follow, you know, some media outlets, the
pictures and things can make you think LA is on fire,

(05:37):
like it's massive, people, you know, here I'm talking
about, it's an insurrection and all these types of things.
I recall, for example, when I lived in Moscow, when
the parliament was stormed by tanks and things like that
in the early nineties. I remember, you know, my family
and others calling, like, oh my god, you know,
I watched it on TV. It looks like, you know,
you're in the middle of a war and everything. But
if you weren't within a one or two block radius, you'd

(06:00):
be in this massive city.

Speaker 6 (06:01):
You could be going to museums, you could be shopping.
You wouldn't even know that this was going on. But
if you're in those.

Speaker 1 (06:06):
two-block radius, yeah, it was pretty dicey.
And I think that's what we're seeing in places like LA.
You know, it's a block or two around one federal building,
but the story becomes that LA is on fire, and
so that certainly allows bad actors to spread misinformation.

Speaker 4 (06:21):
I think that these kinds of false stories are
spread in a few different ways. The first way
that they spread is, just like your relatives who were
concerned about your safety, it's people who received some piece
of information out of context, and social media facilitates this.
Social media is a game

(06:43):
of telephone, and people are getting information out of context
all the time. So, you know, regarding the ongoing riots
in LA, for instance, I saw an example of an
individual who shared a video created from this video
game called Arma. It's this really ultra-realistic video game,

(07:04):
with an aircraft flying through the air and being
shot at from the ground, and they claimed, oh, look,
the protesters are shooting at aircraft. It was very clearly
not that, and I think that user was making a
joke, because this video game has been used to create
a lot of false videos about the war in Ukraine,
to the point where in certain communities it's
just become a joke. But you take it out of

(07:27):
the context of that person's account, which happened very quickly,
and suddenly there's hundreds of thousands of people believing that
video because they just didn't understand the context.

Speaker 5 (07:37):
And we see that repeatedly.

Speaker 2 (07:40):
We're using the term social media, which is usually defined,
I think, as sort of smaller, you know, accounts, and
yet, you know, there's sort of an American Pravda out there.
I'll just say it: you know, Fox News, which is
not really social media. It's as mainstream as
you can get. I do look at Fox every day,
and the narrative coming out of that is that Los

(08:00):
Angeles is in flames. So it's not just social media. It sounds
to me like micro, but this is really

Speaker 4 (08:06):
a macro thing. Absolutely, but it happens at both the
micro and the macro level.

Speaker 5 (08:12):
Absolutely. I absolutely agree with you.

Speaker 4 (08:13):
And this is the second way that false
messages spread: when somebody is trying to engage with
a particular audience. That might be, you know, an
audience of viewers that are of a particular partisan persuasion
that they know they're trying to reach, or it might
be a particular conspiratorial group. They see something and

(08:35):
they make assumptions about it. Years ago, we saw the
same thing happen: the story of a pallet of bricks shows up
at protest sites. It's clear that, oh, that pallet of
bricks is there because George Soros paid for it to
be there, and that story spreads across social media. We
saw this in the past week in LA as well,
and that's because, again, people are communicating with very specific audiences,

(08:57):
very specific communities, and specific communities are sort of attuned
to expect that message and spread that message when they
want to believe that message, like we were saying earlier.
And when an audience wants to believe something, they're
going to believe it, and they're very likely going
to share it, and

Speaker 1 (09:13):
They're too lazy to go take a picture of a real
pallet of bricks in Los Angeles.

Speaker 5 (09:17):
No, they.

Speaker 4 (09:21):
Yeah, these things, pallets of bricks, they're ubiquitous. We
ignore them when there's not a riot, but suddenly there's
a riot and they're everywhere.

Speaker 2 (09:28):
So it's dangerous telling the truth in an authoritarian regime,
and John and I have both lived in authoritarian regimes.
And Darren, what you're doing now, you know, trying
to deal with disinformation. Do you find that
there's any pressure on you, or you're being attacked, or
your colleagues are being attacked, for, you know, pushing

(09:49):
back on Russian disinformation or on certain political persuasions, pushing.

Speaker 6 (09:55):
What some people call misinformation, you know, your First Amendment right
here. Taking

Speaker 1 (10:00):
away our limited rights to lie about things, which I
guess I had.

Speaker 4 (10:04):
Yeah. And to an extent, you know, I agree with
that perspective. You know, I think people have always said
things that are wrong and always will say things that
are wrong. You know, my research hasn't by and large
focused on misinformation. It's more focused on malign influence, purposeful
lying, rather than just people saying things that are wrong.
It's very, very hard to stop that. Certainly, though, Jerry,

(10:25):
we've had pressure. I've had to share my emails with
various offices in both the Senate and Congress, answered a lot
of FOIA requests. I was part of the Twitter Files
ongoing story. So certainly we've felt that influence. Recently we
had an NSF project that was shut down by
the administration.

Speaker 5 (10:48):
National Science Foundation project got shut down.

Speaker 4 (10:51):
But compared to a lot of my colleagues, we've gotten
off very light. Colleagues at Stanford: the Stanford
Internet Observatory was completely shut down after a series of lawsuits.
Work at the Harvard Kennedy School has been interrupted, which
is why I'm still so vocal. If I can't be
vocal as a straight white man with tenure, who can

(11:14):
be vocal? Fair enough.

Speaker 1 (11:17):
Before we go much further down the rabbit hole,

Speaker 6 (11:19):
it's time for a quick break.

Speaker 1 (11:26):
And we seem to have learned a lot in
the last few years about Russian malign influence. You know,
some people studied it for a long time. Others are
just coming to learn about it. What are we learning
about the Chinese? How are they different
in this space?

Speaker 4 (11:38):
They are entirely different. They are the polar opposite of
what Russia is. And Russia is the best in the
world at what they do, and China is the best
in the world at what they do. I think a
lot of researchers that have looked at the
way that Chinese operations differ have been critical of it.
You know, it doesn't get any engagement, it doesn't, you know, nobody...

Speaker 5 (12:01):
It's culturally unaware.

Speaker 4 (12:03):
But I think some of those critiques don't fully appreciate
what it is that China does. So Russia promotes ideas,
and they're very, very good at that. You know, they
use influencers. They want you thinking about ideas. They want
certain things to be more palpable, to be more real
to particular communities, particular audiences. What China does is they

(12:24):
want to demote ideas, especially ideas that are critical of China,
and they do that in a few ways that I
think are very effective. And I think both of these
things are sort of rooted in how each of these
countries have operated in the information space historically. You know,
like in Russia, they have a long history of active measures,

(12:46):
as you were saying earlier, John, and we see that
today they continue to apply some of those same approaches,
whereas in China we see them having complete control of
their own media space. You know, you can't even post
a picture of Winnie the Pooh in China because it
looks too much like President Xi. And they apply that
same sort of mindset to the Western information space. So

(13:09):
what they'll do is, for instance, they especially target members
of the Chinese diaspora. So we worked with the New
York Times on a story last summer about a Chinese
dissident named Dong who was critical of Xi. He wrote
for places like Foreign Policy, writing articles that were critical

(13:30):
of the Xi presidency in the PRC,
and so he was targeted by fake accounts that were
critical of him. But it wasn't just him that was targeted.
They started to target his teenage daughter, which angered me personally.
I have teenage daughters. And they targeted her with messages
suggesting that this young girl was a prostitute, with messages

(13:54):
that offered her services. They made posts offering a
reward to harm her, and they made these posts on
her school's social media accounts, so they weren't getting any engagement.

Speaker 5 (14:08):
That's true.

Speaker 4 (14:09):
You know, nobody was hitting retweet on any of these posts,
but they.

Speaker 6 (14:13):
Had needed to get the message. We're getting the message.

Speaker 4 (14:15):
They had an audience of one, and it was
this girl's father. And they didn't just post it on
social media, they also posted it places like TripAdvisor.
TripAdvisor has worked for years to move up where they
land on a Google search, and so if you were
to search this young girl's name,

(14:35):
absolutely TripAdvisor would be the thing at the top
of the ranking. And it's only because, you know, we
were able to work with these companies to get these
posts taken down that that's not the case for the
rest of this young girl's life. And so the Chinese
are very good at that. They also engage in these
same kinds of tactics, not just targeting people, but

(14:57):
also targeting ideas. So they'll flood a hashtag so that
it becomes unusable by sending thousands of accounts using that
hashtag in completely unrelated ways. So they don't target ideas
so much as they target systems.

Speaker 1 (15:14):
And is it one organization, is it a variety of organizations
operating, sort of going in the same direction? What's
the bureaucratic piece behind it?

Speaker 4 (15:21):
Yeah, just like Russia, it's multiple organizations going in the
same direction. The main organization in China that we know
the most about is the Chinese State Police. It's a
campaign usually called Spamouflage, sometimes Dragonbridge, depending on the organization
that's naming it. But it's the Chinese State Police that

(15:42):
target these members of the diaspora community abroad. But there's
other campaigns as well. Individual states inside
of China, for instance, will have marketing companies that will
sporadically run propaganda or disinformation operations. But of course, probably
the single, I guess, most expensive piece of the

(16:03):
Chinese influence operation, just like Russia, is state media. China
runs a huge, multi billion dollar state media operation, and
a lot of that isn't necessarily targeting their own people
who can't even view it or read it, but targeting
the Chinese diaspora.

Speaker 2 (16:22):
Do you have any questions about CIA or the agency that.

Speaker 4 (16:25):
you'd like to ask? Yeah, I mean, I am endlessly fascinated by
the sort of influence operations that the US has engaged in.
Probably only a few of them are public at this point;
there are a few that I know.

Speaker 6 (16:38):
I don't think the.

Speaker 1 (16:39):
US does it much or very well anymore. I mean,
obviously in the fifties and sixties there was a period of
time when we were trying to play the same game,
and we found out that we were bad at it
and that maybe the truth was a better weapon. And
after the reforms of the seventies, I think the military
gets involved in some sort of, what they think of as
battlefield misinformation, whatever you want to call it, psyops.

Speaker 2 (17:01):
With psyops, I think you might find that the agency
was trying to understand, like, the question about community: what
is the jihadist community, what do they look like,
how do they organize, how do they get their message out?
How do we get inside their messaging? And so there were
things, I think I can say this, there were

(17:21):
things like, okay, so, you know, if you want to
impact how al Qaeda messages, right, and their message is
kill Americans, kill Jews. Okay. So if you come in
and say, no, we should be nice to Americans, first
of all, no one's going to listen to you. You
can't even get inside the community. But if you come in
and say something like, yeah, okay, Americans aren't very good,

(17:41):
but let's talk about how come none of bin Laden's
family members, no Saudis, are going out there, no senior
al Qaeda members are out there, their kids aren't blowing themselves up.
They just get us Egyptians to blow ourselves up. What's
up with that? You know, there are ways to get
inside the community to sort of, like we were saying the
Russians do, tweak or pry at weaknesses that

(18:05):
are extant and real inside of a community.

Speaker 1 (18:09):
It's also not meant to be publicly facing.

Speaker 2 (18:11):
Right, no, no, no. And in fact, we're crippled in
doing that. I don't think we do it well, simply
because we're not allowed to operate in English.

Speaker 3 (18:19):
For example, right.

Speaker 2 (18:19):
Because we're not allowed to; it's not allowed to wash
into US politics or narratives or community. So, you know,
we can't do it in English, even though they may
do it in English. So we have to take special
care not to impact information that Americans or
Westerners could consume. So as a democracy, I think we understand.

(18:42):
As an intelligence community, we understand how important these things are.
But the way our democracy is structured, we're never going
to be able to nor should we be able to
do the sort of thing that Russia does or China does.

Speaker 4 (18:56):
Tit for tat. It causes us to have to fight with one
arm tied behind our back, which is good.

Speaker 5 (19:00):
I don't worry that.

Speaker 4 (19:01):
Yeah, no, I would prefer that fight, but certainly it
makes it a more difficult place.

Speaker 1 (19:07):
But the fight we were better at, like we talked
about, Radio Free Europe and that type of stuff, was
using the truth as a weapon in those places. What
I worry about now is our politics. Our leaders are
openly using falsehoods and misinformation for their purposes, so we
have less credibility.

Speaker 4 (19:26):
Probably the other piece that I'm worried about is that
we don't even have Hollywood as much of a weapon
as we did in the past. You know, we're now
making movies for China and following the dictates of China
for what appears in our movies rather than Rocky IV.
I've always felt that if more politicians went and watched
Rocky IV, we'd all be better off.

Speaker 2 (19:45):
As a former agency officer, a brickbat I would
throw at Hollywood is this trope where there's always,
like, the rogue CIA agent who's good, but the
institution is bad, or, you know, the senior leadership
is bad. There's always some sinister force, which I get
is a fun trope, but, you know,
institutions are thousands of, like, bureaucrats who aren't paid very

(20:08):
well, trying to do the right thing, and it's just boring.

Speaker 4 (20:11):
Yeah, most of the real world is unfortunately boring,
which is why we have conspiracy theories.

Speaker 1 (20:18):
Yeah, you know, we're fighting a losing battle. But Darren,
thank you so much for your time. It's really interesting,
and we're really glad for what you're doing. I know there's
pressure on you for doing that, but I think we
all need to try to push. You know, a post-truth
world is not a good world for the United States,
I don't think.

Speaker 2 (20:32):
And we all need to take Spot the Troll, right? We
all need to take the new updated version.

Speaker 4 (20:37):
Yeah, come August, it'll be out and I'll be excited
to share it.

Speaker 2 (20:40):
Where do people find it?

Speaker 5 (20:41):
Spotthetroll dot com.

Speaker 6 (20:43):
And we're over sixty so that's a problem.

Speaker 5 (20:45):
You'll need to take it twice then.

Speaker 3 (20:49):
We'll take a quick break. We'll be right back. John
and Jerry, this is your producer, John, rejoining you, and
also your producer Rachel Harner. Hey, everyone, I thought it'd
be fun to take Darren up on his offer and

(21:10):
let's try playing Spot the Troll, guys. Pull up spotthetroll dot
com on your computers. Okay, click on profile one, Chloe Evans.

Speaker 2 (21:20):
She seems so nice.

Speaker 3 (21:22):
So this is Chloe Evans. She's a student in Atlanta.
She joined in June twenty fourteen. She has a lot
of posts here about Obama being responsible for a chemical
plant explosion, quotes about how life is terrible.

Speaker 6 (21:35):
What do you guys think? I'm going with troll.

Speaker 2 (21:38):
She says she used to be a model, so I'm
going with troll. I don't think
that's true.

Speaker 7 (21:44):
Rachel, Yeah, I'm getting some troll vibes from this.

Speaker 3 (21:48):
She's a troll. Hundreds of Twitter accounts like Chloe were
activated through twenty fourteen and twenty fifteen, based in Saint Petersburg, Russia.
Oh nice. And some of the signs are: she pushed
hoax events that never happened. Her profile image is of an
attractive woman, something that always works, according to the site

(22:10):
and my own personal experience, there.

Speaker 2 (22:12):
was nothing in there, and that's sort of true. "She was
a model" or anything, "I am a model," like, English
speakers don't talk like that.

Speaker 3 (22:17):
Well, that's another thing it says: there were
no personal posts. It was all political. She had nothing
to say about her own personal life, and that was
suspicious. Profile two, Harmonie Anderson, the college girl who manages
to stay conservative. "This is my second account where
I share my political and social views." She's from Ankeny, Iowa,

(22:39):
and she joined in September twenty nineteen, and she posts
here about Pennsylvania vote rigging. A lot of things here
about how the twenty twenty election was rigged, illegals voting.
Here's something about how great Melania Trump is, how she
should have been on the cover of Vogue, how Biden
was losing his mind.

Speaker 2 (22:59):
Oh, she found James Woods, the wacko Hollywood actor.

Speaker 7 (23:03):
So what is she?

Speaker 2 (23:04):
You know what? She could be real, but just like
in the club, she's killing people with this bullshit. But
this seems more real. But it seems like she's also,
like, deeply MAGA.

Speaker 7 (23:15):
She uses a semicolon at one point, which feels a
little, I don't know, what do

Speaker 2 (23:19):
You gotta get semicolons?

Speaker 6 (23:21):
I voted troll.

Speaker 3 (23:22):
I stopped using semicolons because a semicolon does
what a period does, so just use a period. Malcolm
Gladwell taught me that, not personally.

Speaker 2 (23:29):
She follows Melania Trump, so who would do that? I'm
going with troll.

Speaker 7 (23:33):
Yeah, I do feel like it's a troll.

Speaker 3 (23:35):
Harmony is a troll. She was attributed to the Russian Internet
Research Agency in March twenty twenty, and then they worked
with Twitter to have her account suspended. Lack of personal
info again: nothing about family, schoolwork, friends, pets. Another attractive photo.
Troll accounts like Harmony often retweet prominent voices. These may

(23:56):
include positive and uplifting content to gain credibility and connect with
their target audience.

Speaker 2 (24:02):
Also, their grammar is really good but very simple, in
both, right? I mean, it's like you've run it through
ChatGPT or something.

Speaker 3 (24:10):
This is a CIA analysis now. Now you're getting into
a level of reading a person. Okay, profile three, Christopher
Warwick, lives in Columbia City, Indiana, and he is reposting
a lot of tweets about standing with the soldiers and
a lot of pro-military tweets. They don't seem particularly political.

(24:32):
Here's a selfie of him and his daughter in a tractor.
Here's something of Jesus praying for the world. Here's a
photo of a barbecue place he recommends for lunch. And
here's a tweet of a meme. It's a picture of, I
guess, God saying, I didn't say the end of the
world would be signaled by trumpets. I said Trump slash Pence.

(24:53):
So if Trump Pence signals the end of the world,
are they good? Are they bad?

Speaker 2 (24:59):
I don't think Pence is there anymore.

Speaker 1 (25:00):
Is he good?

Speaker 7 (25:01):
There's also a post about COVID.

Speaker 6 (25:03):
Yeah, so I'm saying he's real. What do you say, Rachel?

Speaker 7 (25:07):
I mean, there are more kind of those family cues, right?
There's the photo of his daughter in a truck. It
looks like the same kid that we see in the
first photo, though we don't get a good picture of
his face. He only has ninety-four friends, which, I
don't know, maybe seems low to me, but I think
I'm going to go legit on this one.

Speaker 3 (25:26):
He's real. He is a kind husband and father living
in Columbia City, Indiana. We can confirm he is real
because we know him personally. He has posts about his
life and his community, like the lunch spot. He does
not push out constant messages supporting extreme views. For example,
while some of his political posts are right-leaning, he
also pokes fun at President Trump, like the Trump slash Pence meme.

(25:49):
All right, profile four, Power to Women, American feminist. She
needed a hero, so that's what she became. Also a
lot of memes. How would you judge these memes?

Speaker 6 (26:00):
Bernie memes, feminism, a lot of feminism.

Speaker 7 (26:03):
So this is much more like a liberal, left-leaning account.

Speaker 6 (26:06):
For sure, we.

Speaker 3 (26:07):
Are going to fight to pass the long overdue Equal
Rights Amendment, Senator Bernie Sanders. What purpose would the Russians
have of or would a troll have of putting out
this message? Are they trying to make it so over
the top with its liberal messages that it kind of
mocks itself that it will turn people off to those

(26:30):
liberal and feminist points of view?

Speaker 1 (26:32):
Well, I think there's a benefit, if you're a foreign
actor, to just get communities fighting each other, to move
people to the extremes, right? So the country arguably works
best when both parties are more towards
the middle. So your goal is to move people to
the extremes, on both extremes. And in fact, you know,
in the twenty sixteen election, that was part of the goal.
They were trying to get Black voters, who they saw

(26:54):
as liberal voters, to be so upset with the system
that they just didn't vote, because both sides are the same.
At the same time, they were pushing hard on the right to
get people who never voted before to get so angry
they'd come out and vote. So they were trying to
support the Trump campaign by appealing to both sides. And
I think that's what might be going on here
if you think it's a troll.

Speaker 2 (27:13):
But these aren't, like, calling for violence, and they're
not completely wacky. You know, let's pass the
Equal Rights Amendment. You can be for or against it. That's
not huge.

Speaker 7 (27:23):
But read that caption, because it has a lot
of typos and weird syntax.

Speaker 2 (27:27):
Okay, well, typos and syntax mean that they either went
to school down south or they're... Oh.

Speaker 7 (27:34):
Even the last line, "how cool would this be?" It
just feels a little too formal. There's
a typo right at the top: "What's going on
with this country, imagine is dozens of women gathered
to make decisions on men's health." Like, ineffective trolling. Yeah,
I say troll.

Speaker 6 (27:50):
I said real, but I'm wrong.

Speaker 3 (27:52):
You guys are right. John's wrong. Russian troll. It's
an affinity group with no clear organizer. Power to Women
presents itself as an affinity group rather than a person.
This is likely because Instagram and Facebook users don't typically
follow non-famous strangers. For or against, it takes advantage
of its followers' passion to encourage extreme polarization and suggests

(28:17):
that compromise is impossible. So that's what you were saying:
create conflict. Profile five, Amy G, daughter, sister, proud Black American.
"I tend to get political." New York, New York. Joined
January twenty nineteen, thirteen thousand followers, and it says no
bad vibes. Well, how would you describe what we see

(28:37):
in her posts here?

Speaker 2 (28:38):
So this is tricky because she's got the hashtag Trump
is a Russian asset, Russian interference. So would the Russians
put out stuff that says that they are responsible,
you know, that people have to be careful of
Russian interference?

Speaker 3 (28:53):
Here's something negative about Roger Stone. It's hard to think
of something positive though.

Speaker 7 (28:57):
Yeah, it is a really wide spectrum of things that she's
talking about, right? It doesn't seem very targeted. You know,
we have the kind of joke about Trump at the top,
and then we have a Michelle Obama quote, right? Then
we have the "Trump is a Russian asset" quote, and
then, yeah, we go into Roger Stone and then

(29:19):
something about a six-year-old being held at
a mental health facility without their parents' permission. It seems
very, I don't want to say nuanced is the right word,
but it does seem like this account is, kind of,
at least, if nothing else, copying and pasting from a
wider variety of sources than some of the other trolls
that we've seen. If this is indeed a troll.

Speaker 2 (29:37):
It's boring, but I think it's real.

Speaker 7 (29:39):
I'm gonna go with real.

Speaker 3 (29:40):
I also said real when I saw this, and we
were all wrong. It's a Russian-run account.

Speaker 7 (29:45):
Wow, they're getting better.

Speaker 3 (29:47):
Amy pretends to be a left-leaning Black woman, a
common tactic of trolls. Amy's strategy is to engage with
the Black community and other left-leaning users, gain followers
among those users, and then use her influence to manipulate users'
conversations and push particular political agendas. So, kind of a
false flag situation.

Speaker 2 (30:07):
Yeah, it's also interesting that, as Darren said, Storm-1516, the
Russian troll house, they seem to specialize in Black American grievances,
and one that they seemed to push really hard was
someone who said that they were a Black American and
he was tired of people ripping down his Trump sign

(30:27):
and that basically, you know, no one
could stop him from backing Trump. And apparently this got
a lot of responses from whites as well. So interesting.

Speaker 3 (30:36):
All right, well thank you for playing guys.

Speaker 2 (30:39):
So Mission Implausible is this?

Speaker 6 (30:41):
Are we trolls?

Speaker 2 (30:42):
People can't really see us?

Speaker 6 (30:44):
Are we real?

Speaker 2 (30:45):
What do you think, Rachel, I didn't see us.

Speaker 7 (30:47):
John's a troll on Bluesky, guys. I mean, he's more of
a troll on Bluesky than he
was on Twitter.

Speaker 2 (30:51):
But so he's legit. But he's also a troll. But
real people can troll.

Speaker 1 (30:56):
It's a verb and a noun.

Speaker 3 (30:59):
Mission Implausible is produced by Adam Davidson, Jerry O'Shea, John Sipher,
and Jonathan Stern. The associate producer is Rachel Harner. Mission
Implausible is a production of Honorable Mention and Abominable Pictures
for iHeart Podcasts.