Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
I'm John Sipher and I'm Jerry O'Shea. I was a CIA officer stationed around the world in high threat posts in Europe, Russia, and in Asia.
Speaker 2 (00:11):
And I served in Africa, Asia, Europe, the Middle East
and in war zones. We sometimes created conspiracies to deceive
our adversaries.
Speaker 1 (00:20):
Now we're going to use our expertise to deconstruct conspiracy
theories large and small.
Speaker 3 (00:25):
Could they be true?
Speaker 2 (00:27):
Or are we being manipulated?
Speaker 1 (00:28):
This is Mission Implausible.
Speaker 2 (00:33):
So today I'd like to welcome Darren Linvill, the co-director of Clemson University's Media Forensics Hub. And you study fact-based, data-driven stuff about conspiracy theories and social media and misinformation, and...
Speaker 1 (00:49):
You study stuff.
Speaker 4 (00:51):
I do feel that way sometimes, but yeah, study stuff.
We go down a lot of rabbit holes and it's
taking us somewhere different every other day.
Speaker 2 (00:58):
So, Darren, I guess the first question, what everybody wants to know, is for you to prove to us that Joe Biden is not a robotic clone and wasn't executed in twenty twenty. Can you definitively prove that that's untrue right now?
Speaker 3 (01:13):
I can't prove what's...
Speaker 2 (01:18):
So I guess the question is, President Trump famously, a couple of weeks ago, posted this to ten million of his followers. He reposted it, okay, I got it. You know, as someone who studies the science of this, what would be the strategy behind this? I mean, why would somebody post something like this unless it's supposed to be funny or sarcastic? And I didn't get the comedy in it.
(01:40):
So what's your sort of sense for posting these kinds of things that are obviously outlandish, you know, obviously a conspiracy theory, and yet it's propagated out and popularized? And so what would your sense be for why someone in authority would do that?
Speaker 3 (01:54):
Yeah, I think there's a couple of reasons.
Speaker 4 (01:56):
First of all, Jerry, I think it's important to point out you're not the target audience, so the fact that you don't find it funny is completely inconsequential in the conversation. Many of his followers, even those that don't believe it to be factual, which is presumably many or most of them, would still find it funny.
Speaker 3 (02:14):
You know.
Speaker 4 (02:15):
The main reason he's going to post that sort of
thing is just to show that he's in on the
joke that he is part of the conversation, and being
part of that social media conversation has become increasingly important,
I think, to him and a frightening number of billionaires.
Speaker 1 (02:31):
Frankly. Yeah, we used to think people in serious positions did serious things, but I guess entertainment is part of it now. That goes to my question. You know, we look at conspiracy theories, and effective disinformation seems to be similar in some ways to sticky conspiracy theories. You know, they have some sort of basis in truth, and they tell us the stuff that we want to believe is true.
(02:54):
They appeal to our emotions and our feelings.
Speaker 4 (02:56):
And they appeal to specific communities. They appeal to the
community that wants to believe that that thing is true, right,
and they're tailored for those communities.
Speaker 1 (03:04):
No, exactly. So it seems so much of our interest nowadays, for example in bro podcasts, is our desire to be entertained. We get our politics amidst these crazy fun culture stories because just straightforward talking politics is boring. Do you see this?
Speaker 3 (03:18):
Yeah, I think that's absolutely the case.
Speaker 4 (03:19):
I don't think that that's just today, though; I think that's long been the case. I mean, if you look back at journalism, during the era of yellow journalism, it was much the same. You know, it was tailored for specific communities and it was far from factual. Things were different when there were just a few networks and everybody was listening to the same thing. But I think
(03:41):
that's definitely become the case now where politics has been
integrated with entertainment and we can't separate them, and only
specific communities are gonna find certain messages entertaining.
Speaker 2 (03:57):
Could you bore down on the word community? I mean, in most communities there's a social contract, right, we trust each other. When you stop somebody on the street and ask for directions to the store, chances are they're going to tell you the truth, right? They're not going to lie to you.
There is this sort of sense of social community and truthfulness amongst people who live in a certain place
(04:18):
or share certain ideals. But what is a community online, or people who live online, or people who inform themselves from social media? I didn't get the joke, so I'm not in that community. But to be honest, I'm not really sure what that means.
Speaker 4 (04:35):
You know, I think that social media is really good at bringing people together, specific people with specific interests, specific ideologies, sometimes specific sets of beliefs. And you can see this in all kinds of ways. I mean, there was no furry community to speak of before social media and before the Internet, and
(04:56):
it really helped foster both small and large groups of
people with specific beliefs and ideas and senses of humor
sometimes that are going to bring them together. And we
see this across the political spectrum. And we've seen this
exacerbated in recent years by changes in social media.
Speaker 3 (05:16):
You know, it used to be I think that.
Speaker 4 (05:17):
The idea that people were siloed in tiny little communities
was a bit overdone because when you looked at the
actual data in years gone by, people really did actually
engage with multiple viewpoints, sometimes even more so than in
their actual, you know, non-digital lived experience. I live
(05:38):
in South Carolina. You know, you go to some small
towns in South Carolina and people aren't getting any large
variety of political beliefs or ideas, just like if you
go on Truth Social, it's the same. Whereas today this
siloing has been exacerbated because so much social media has
(05:58):
been separated. Musk bought Twitter and named it X, and a lot of left-leaning users went to Bluesky. A lot of people have gone to Telegram, where everything is based off of very, you know, private, smaller groups, and they aren't getting those outside viewpoints that they did in the past, which
Speaker 2 (06:19):
Brings us to cannibalism. So when I was in Germany, there was this big case where people who seriously wanted to be cannibals got online and formed their own community, a smaller community of people who had this fetish, who wanted to be eaten, and they could only actually get together
(06:42):
because of social media, and of course eventually they did
and this one individual was killed and eaten by another
individual voluntarily and it went to court as to whether
that was voluntary or not. But in the Counterterrorism Center,
we did talk with our foreign counterparts about, well, okay,
someone wants to commit a terrorist act, they live in
(07:02):
the middle of this small German village. How are they
going to meet somebody else, you know, to find someone
else who also wants to commit a terrorist attack? Because whereas before, and again in Germany, the Baader-Meinhof Gang, they all had to get together at the university. That's where they found their way together, but that no longer happens.
So communities can be you know, one person sitting thousands
(07:25):
of miles away from another person and getting up to all.
Speaker 1 (07:29):
Sorts of things. So maybe a community of terrorists who want to use cannibalism as part of their...
Speaker 2 (07:34):
Well, that's why you did so well in the CIA, John.
Speaker 3 (07:37):
Yeah.
Speaker 4 (07:38):
And the concern here is that, you know, things that might be mocked, or might not be normalized, in broader society become quite normal in smaller communities. I mean, goodness knows, we've seen this with the QAnon community, for instance.
Speaker 3 (07:52):
Some of their.
Speaker 4 (07:53):
Beliefs and understandings of the world would not go far in the broader society, but when all they talk to is each other, those things become very normal.
Speaker 1 (08:02):
And communities of interest are sometimes good, right? If I get online and I get a social media thing, I can follow the New York Knicks in the playoffs, or something that to me is interesting, and I want to hear what people are saying.
Speaker 4 (08:11):
Of course. No, community is a wonderful thing. I think we can all agree with that. But many wonderful things have negative outliers.
Speaker 1 (08:18):
It seems that now people are starting to view the public spaces as polluted, and sometimes it's because officials like Trump and others are deriding the mainstream media and saying bad things about them, and so people are often searching for smaller news sites and other kinds of places to get news and entertainment, and avoiding what they call the mainstream media. And to me, like, one of the answers
(08:40):
to getting people back to some sort of truth-based stuff is actually the mainstream media. Like, the mainstream media, it gets things wrong, and it has a corporate interest and all those types of things, but it does have professionalism where people are trying to fact-check things. So my opinion is I think we actually ought to be moving back towards the mainstream media.
Speaker 4 (08:57):
Oh I'm biased, but of course I completely agree with you.
I think that when we look at the effect of
things like state influence campaigns and the spread of malign influence, disinformation,
whatever we want to call it, over the past several years,
and the rise of the prominence and prevalence of these things,
(09:20):
one of the single biggest impacts of that, I think,
and purposefully so on the part of for instance, the Russians,
is the spread of doubt, and I think many of
us in the West and the United States especially have,
as you said, John, this tendency to just doubt what
we see. And sometimes that can be healthy, but it
(09:42):
has a really negative consequence as well, that people are
beginning to doubt the objective truth as well as the
thing that might possibly be disinformation. You know, as we're
having this conversation, the LA riots are ongoing, and just
a couple of days ago, Governor Newsom posted these images
of National Guardsmen sleeping on the floor and he commented
(10:07):
on how awful this was that they were being treated
this way, and immediately users responded by saying, oh, that's fake.
Those are images from Afghanistan in twenty twenty. They asked
Grok if it was true, and Grok responded that yes, these are images from twenty twenty. So even the technology
we rely on to give us the truth failed us,
and it was just because we're so accustomed to this
(10:30):
belief that we're being lied to. In some cases we are,
but we're so accustomed to that sort of knee jerk
reaction that doubt is always present.
Speaker 1 (10:41):
Now, all right, let's take a break, we'll be right back.
Speaker 2 (10:58):
So, Darren, you mentioned Russia. There's a huge overlap between some of your research now and what John and I used to do in our former days. You seem to have done a lot of research on Storm fifteen sixteen, and I was wondering if you could take us through some of the nastiness that it's been involved in. I think people may be surprised at the scope, the spectrum,
(11:22):
and the depth of what it's done.
Speaker 4 (11:24):
Yeah, Storm fifteen sixteen is a fascinating example of how Russian disinformation has evolved, and really evolved into something that has come full circle back to what it was in the past. My lab here at Clemson, the Media Forensics Hub, we identified Storm fifteen sixteen for the first time in twenty twenty three, and then a team at Microsoft actually named it Storm fifteen sixteen, because I guess we live in a spy novel.
(11:47):
It's a narrative laundering campaign that uses a number of different tactics. Narrative laundering is a process by which an actor places a bit of false information. That's the first of three steps in narrative laundering: you place that information. In the case of Storm fifteen sixteen, it often takes the form of a single YouTube video, for instance, and
(12:09):
then you layer that false narrative through various methods.
Speaker 3 (12:15):
It might include foreign.
Speaker 4 (12:18):
Media outlets that might place that story or write a
story on your behalf for a few dollars or rubles,
as the case may be.
Speaker 3 (12:24):
It may be.
Speaker 4 (12:27):
an influencer or a social media influencer that you have in your employ, or it may even be various state actors, state media. The Russians often use the social media accounts of their embassies, for instance, to put out some of these stories. And that's the layering process, and that
takes us to that final process, which is integration, where
(12:50):
these stories go from just coming from your state-operated voices to the general population. And all of this is done in such a way that each step
this is done in such a way that each step
in the process adds a little bit of credibility to
the story, takes it farther away from that original placement,
so that it's harder and harder to tell where that
(13:11):
story came from, until it's just part of the organic conversation.
Speaker 2 (13:15):
Except, Darren, I mean, really, where are they going to find right-wing influencers in the United States who would propagate these stories for money, through an intermediary?
Speaker 4 (13:24):
I can tell you where one of them lives. We were able to track down one of the influencers. He lives in Massachusetts, and he took one hundred dollars from a known Russian agent to place a video of what were really West Africans, part of the West African community in Saint Petersburg, but they were pretending to be Haitian
(13:48):
immigrants who were brought to North Georgia to.
Speaker 3 (13:52):
Vote in the twenty twenty four election.
Speaker 4 (13:55):
Most of Storm fifteen sixteen, though, has targeted Ukraine and specifically the Zelensky camp. We've specifically been tracking stories that are created from whole cloth, stories like, oh, Olena Zelenska spent over a million dollars at Cartier, and here's a video.
(14:17):
Yeah, yeah, yes, yes, he bought two... Yeah, it's valued at seventy five million dollars. I've lost track of how many villas he's bought at this point. The first villa he bought, and I'm putting "bought" in air quotes, was back in December of twenty twenty three. He bought this Egyptian villa. And that story
(14:38):
they actually brought full circle because they had this fake
journalist tell the story about him buying this villa in Egypt,
and then a few months later they brought the story
back by fake murdering the fake journalist in another fake story.
And so these stories, some of them may seem ridiculous,
but they're actually very well crafted, especially to target a
(15:01):
specific audience. So let me tell you just how well
that's done. There was one video that Storm fifteen sixteen
did the day after the US election. It appeared and
went completely viral. If you were a right-leaning American on Telegram or X the day after the election, you almost surely saw this video. It was seen by millions and
(15:23):
millions of people, and shared by tens of thousands.
Speaker 3 (15:27):
It was a.
Speaker 4 (15:27):
Video of two supposed Ukrainian soldiers. They didn't have any
markings on them other than just a Ukrainian trident that
to supposed Ukrainian soldiers shooting at a mannequin with a
Maga T shirt on and a red hat and then
lighting it on fire. The video itself looked like something
I would have made in my backyard when I was
(15:48):
in middle school. It looked extremely sophomoric, but it was actually really carefully crafted, not to reveal anything that they didn't want you to see, as in where in the world this might have been videotaped, but to tell a very concise story. That concise story being: Ukraine hates Trump.
This video appeared because of the Storm fifteen sixteen campaign,
(16:10):
so we traced the roots of this video. We traced it to a Discord channel two weeks earlier, the first place it ever appeared, and it was a Russian-language, pro-Ukrainian Discord channel, where a user named Ukraine Is Life, his first post was, I have a nice video, how do I share a video? His first post ever.
(16:32):
He posts the video, then a couple of days later it goes to a Telegram channel, the Telegram channel of the seventy-ninth Airborne Assault Brigade, that's a Ukrainian brigade.
Speaker 3 (16:41):
For those of you that don't spend.
Speaker 4 (16:42):
Much time on Russian language social media, it's important to
know that Ukrainian and Russian military units all have their
own telegram channels where they get on and talk about life.
This is supposedly one of those telegram channels, but it
wasn't the only telegram channel of the seventy ninth Airborn
Assault brigade. It was the much smaller one and not
(17:03):
the one linked to from the Ukrainian government's website. So
from that page, which was clearly, you know, a false
Russian account, it went to the Telegram channel of a Ukrainian separatist politician, from there to Pravda English, which is Russian state media, and then eventually to accounts run
(17:25):
by the Storm fifteen sixteen campaign, social media accounts, influencers that are in their employ, and the next day it went viral across social media, essentially the start of Russia's lobbying campaign of the Trump administration.
Speaker 1 (17:41):
They've had a lot of experience with that. And I mean, you mentioned this with Storm fifteen sixteen, but you also mentioned, and I think it's true, Darren, that there's a long history of this, especially with the Russians. Right, there's terms they use, whether we call it political warfare or subversion or active measures.
The key here is these are really the output of large bureaucracies, professionally run by intelligence services, and they're
(18:04):
meant to, like, erode our political order based on facts and truth. And there's a long history of these, and I'm sure you've written about some of them: the Protocols of the Elders of Zion back in the Tsarist days, or Operation Trust, the Nuclear Winter campaign, the Nuclear Freeze campaign, this fake story that AIDS was created by the Pentagon.
So in some ways, you know, these are different ways
(18:26):
of going at the same thing: finding a weakness inside your enemy and exploiting it to create friction inside your enemy.
Speaker 4 (18:34):
You're absolutely right, John, they're using fundamentally the same tactics,
the same systems that they used in the past. That
the important difference is that what took them months or
sometimes even years to do in the past, they're doing
in days now.
Speaker 1 (18:48):
And to the Russians, in many ways, it's a little bit... when we think of warfare, we think of our military, and the military does warfare. But for years, especially from the Soviet Union through to Russia, the political sphere and the information sphere have been as big a part of warfare, in a certain sense, as the military. So an all-of-government approach is something we often talk about, but the Russians have been doing it for years, decades.
Speaker 4 (19:09):
The same is true with China and many other countries
and we're behind the ball on all of these things
relative to those countries.
Speaker 2 (19:16):
And maybe we need to be behind the ball on centrally organizing this. You need an authoritarian government, right? It's hard for a dispersed democracy to do that. So terminology is important: malign influence, misinformation. Back in the fifties,
the CIA was involved in an information campaign that was
(19:37):
enormously successful: Radio Free Europe. Right? It started it up, and the whole point was to beam uncensored, truthful information into the Soviet Union and the East Bloc, where everything was lies. And it was enormously successful, so successful, in fact, that CIA withdrew from it after a couple of years and it continued to run by itself.
(19:58):
Up until just now. Until now, oh god, yeah.
Whereas before in the nineteen fifties, truth was the real weapon, right,
beaming it into lies, and now it seems to be
the other way. What's changed? Is it social media?
Speaker 4 (20:16):
I think social media is important, but it's not just
social media. I think a lot of technologies have made
these processes easier. It's so much easier, for instance, to
set up a web page that looks to be a
legitimate news outlet. I mean, this is done by both
people who just want to set up small news outlet startups,
(20:36):
but it's also true of foreign information operations. You know,
I've seen websites that were made by the Iranian IRGC
that looked like a website that I would find interesting
and might stop and read. And the Russians did the
same thing with Storm fifteen sixteen, where they created legitimate-sounding and appearing news pages, whether DC Weekly or
(20:59):
the Miami Chronicle. These things sound legitimate. You go to
their web page, they look professional, and they have content
that's all created by AI. It's both the ease of
digital technology. AI is facilitating a lot of these things.
We're seeing more and more troll campaigns that are operated
entirely by artificial intelligence. So social media certainly facilitates the
(21:24):
dissemination of these false stories, but it's just one cog
in a larger machine.
Speaker 5 (21:33):
Let's pause for a second.
Speaker 3 (21:34):
We'll be right back.
Speaker 1 (21:41):
What do you think the impact of so much anger online is on our lives, when we didn't seem to have that outlet in the past?
Speaker 4 (21:49):
For me personally, it's less time on social media. But you're absolutely right. I mean, people, for some reason, and I think that many of those reasons are actually quite obvious, are incapable of operating in the real world in the same way that they do in the digital world. I think if people treated the digital world more like they do the real world, we'd all be better off. You know, people in the real world, they
(22:10):
walk down the sidewalk and they know, oh, this person is a stranger. I'm not going to let them in my home or give them my phone with all my contact information just because they're wearing a T-shirt that I like, or because they said something positive about the politician that I approve of. But we do that every day on social
Speaker 1 (22:29):
Media, or say nasty things to people who...
Speaker 3 (22:31):
Say nasty things, yeah, that they would never do in real life.
Speaker 1 (22:34):
I see these people who are completely anonymous, just spewing these horrible things. You're like, you know, to me, where I grew up, like, what kind of tough guy are you that you're using a fake name to attack somebody?
Speaker 4 (22:43):
That just seems... Yeah, lots of research suggests that anonymity plays an important role in this because it just takes away any possible negative consequences for your actions. We've seen that the anonymity that comes with social media makes us more likely to align with
(23:04):
the group norms of a particular community, lots of research shows this; you surrender your own identity in favor of the group identity. We're actually seeing increased anonymity in places like X after Elon's takeover. More and more accounts there
are anonymous relative to what they were before.
Speaker 2 (23:23):
And for those of us who want to operate in a fact-based world, what does the research say? You know, one thing I hear all the time is people need to be literate, right, that you need to understand sort of how to deal with social media. And I see you've got a Spot the Troll quiz. So how do people get on? How do people find Spot the Troll? And what
(23:45):
are things that people can do to become more socially literate so that they're not... we're all hoodwinked to a certain extent, right? I mean, you start reading something and you're like, I don't know, is this AI? Is this real?
Speaker 3 (23:54):
It's funny that you brought this up, Jerry.
Speaker 4 (23:56):
We made Spot the Troll just before the twenty twenty election,
and it's had a lot of success. It's had well
over a million users at this point. It's an eight
minute quiz. It walks you through several different accounts and
you have to decide whether it's a real account or
a fake account. The original Spot the Troll was created
(24:16):
to differentiate between Russian-made accounts and real people. We've tested it; it's actually been shown to have a positive effect over time on the way people interact with social media, and it made people a little bit more critical in their evaluation of social media accounts. It's outdated, it's horribly outdated at this point, and we're literally in the process of updating the Spot the Troll quiz
(24:40):
as we speak. We're expanding it past just Russian trolls
to include other state operations and especially fraud. Online fraud is a huge problem, to the tune of billions of dollars in the United States alone every year, and something that we're hoping to try to combat. I'll
say one more thing about the Spot the Troll quiz, and
(25:01):
that is that we actually tested it, as I said, and one particular thing that we looked at in our test of it was age, the effect of age. And we found that older adults, so those over the age of sixty, we did a test and control group study, before taking the troll quiz performed below chance in their ability to identify
(25:25):
a real account from a Russian troll. So they were
just more likely to get it wrong no matter whether
it was a real person or a Russian troll. They
would have been better off flipping a coin. After taking
the troll quiz, we were able to get them up
to chance.
Speaker 2 (25:40):
Oh my god, our community is not doing well.
Speaker 1 (25:42):
John. Well, that was gonna be my question about age and differences. Like, are the younger generations more expecting of bad actors online? And it's funny. So my son just recently came through. He's in university, and he started to give his credit card information to a person who texted him about some sort of problem with the Department of Transportation and his tolls or something, and like, we were like,
(26:05):
what are you doing? That's clearly a fake thing. You
don't have a car, but why are you answering this thing?
I thought they were like much smarter than we were
on that.
Speaker 4 (26:13):
I think what we're starting to find is actually just
that they're susceptible to different types of things. Yeah, and
we just don't talk about it as much because they
don't have as much money and so they're not they're
not losing as much.
Speaker 1 (26:24):
The scam thing is a huge, obviously huge deal. You know,
the disinformation, misinformation is hurting our body politic, but the scam, fraud part of this is doing complete damage
to families and between generations. And yeah, it's a real problem.
Speaker 4 (26:39):
And it's a huge part of the economy in other countries. I've read it's as much as a third of the GDP of North Korea, just the systematized process of defrauding the citizens of other nations.
Speaker 2 (26:52):
There's, you know, there's a social contract within societies, within communities, right, that allows us to function. You assume the other guy is going to stop when there's a stop sign when you're driving. I mean, there's a sense of trust. And yet, coming back to the first question about why Trump would post, or retweet, something about Biden being a robotic clone, is that, and I'm going to misquote Hannah
(27:14):
Arendt here, but it's not to get people to believe things,
but get them to believe nothing, because if you believe nothing,
if you're suspicious of everything and of everyone, we become weaker.
Our society is weaker, our institutions become weaker. And do
you get a sense that the Russians are looking to
(27:35):
you know that that's part of it, is to get
us to believe nothing.
Speaker 3 (27:38):
I think that's absolutely a huge part of it.
Speaker 4 (27:40):
It's certainly part of it in the ways in which
they target their own people. The biggest targets of influence
operations have been and continue to be the citizens of
autocratic nations around the world. And we've definitely seen the way that Putin spreads his firehose of falsehoods at his own people. Because if you don't believe anything, you're
(28:03):
certainly not gonna believe anything strongly enough to decide to go against the nation's leader and try to do something dangerous like revolt. That takes belief. And so it's much safer for these autocrats to keep their
Speaker 3 (28:18):
People believing nothing.
Speaker 4 (28:20):
You certainly don't want them thinking the grass is greener
on the other side of the fence. Yes, and that's another reason they target the West. They target the West
with these accusations of corruption and that we're decadent and
weak because in large part who they're actually targeting is
their own people.
Speaker 1 (28:40):
So they can tell their people that, oh, look, they're
no different than we are. It's bad everywhere, right. You know,
if you think the West is something that you should
be looking up to, that's a mistake. Can we trust
our federal government now as purveyors of information? I mean, the economic statistics that all of our business relies on, and our economy, come from the US government, the FBI,
(29:01):
the Justice Department, the Director of National Intelligence, Homeland Security looking at federal elections, all of these sources of government information. And it seems like we have an administration who wants to put out information that just helps them, as opposed to fact-based information. Do you see any concern there?
Speaker 3 (29:21):
Concern? Yes.
Speaker 4 (29:22):
I think in part it depends on what you're talking
about when you talk about the government. If we're talking
about individual politicians, have we ever been able to trust
individual politicians? Is that any different than it was in the past? But if you're talking about institutions, I guess
I'm more concerned about what they're allowed to say or
what they are empowered to say than I am about
(29:43):
them actually saying things that are factually incorrect. These institutions are still operated by bureaucrats, you know, largely apolitical bureaucrats.
Maybe at the top that's not the case, but institutionally,
I think it still is. And I think that the
real problem is going to be when the people within
(30:05):
those institutions just don't feel empowered to say what they
think they need to say.
Speaker 2 (30:10):
We've been going down the road talking mostly about politics,
but I wonder if you could talk about health as well.
The fact that RFK Junior just fired all the oversight from the CDC, right, independent scientists and medical practitioners who decide on vaccine safety, for example, right, and the entire board was fired, and he brought on the people
(30:33):
that he wants, and most of them I think are not independent, right, they're sort of his people. I was wondering if you can give us a sense of what the difference is, if you see any, between misinformation on politics and misinformation on something that people are really concerned about, the health of their children and medical information.
Speaker 4 (30:54):
I think it's obviously different. I think that the potential harms are far more palpable in the latter, in the case of public health. Obviously we've seen this to be the case in Texas with measles outbreaks.
Speaker 2 (31:08):
You know, are the Russians involved in this at all, or is this not something they care about so much?
Speaker 4 (31:12):
They have been involved in these sorts of conspiracy theories
in the past, mostly piggybacking in order to integrate into
particular communities, to appear part of a community.
Speaker 3 (31:23):
They've used some of this messaging during COVID. There was
a lot of talk about Russian disinformation around COVID, and
there was some, you know, they did try to denigrate
other vaccines, for instance, in favor of their own, but
it was still relatively limited because I think even Russia
understood that they were part of a global community and
really this was something that needed to be beaten globally
(31:46):
and if it spread one place, it was going to
spread everywhere. But I think overall, what we're still seeing regarding distrust of science, distrust of the medical community, is just the effect of the COVID pandemic. You know,
people got mad at the nerds, they got mad at academia,
and they're still mad at academia and medical professionals, and
(32:08):
they're taking it out on us.
Speaker 1 (32:10):
Now.
Speaker 5 (32:11):
There's a lot more to talk about on this subject,
which we will get to next week on part two of our conversation with Darren Linvill. Mission Implausible is produced by Adam Davidson, Jerry O'Shea, John Sipher, and Jonathan Stern. The associate producer
is Rachel Harner. Mission Implausible is a production of honorable
(32:31):
mention and abominable pictures for iHeart Podcasts.