
November 16, 2022 31 mins

In Part Seven of LikeWar, we take a look at the bigger picture to answer the next question facing us all: what can we, as individuals, politicians, corporations, and civil servants, do to stop LikeWar before it's too late?

This episode features the expertise of Lisa Guernsey, director of the Teaching, Learning, & Tech program at New America, and Jimmeka Anderson, a media literacy educator at New America.

This series is adapted from the book LikeWar, written by series narrator Peter Singer and series contributor Emerson Brooking. To learn more about their research and defense work, you can find them on Twitter @peterwsinger and @etbrooking.

Get the book at LikeWarBook.com.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Mm hmm. So we started this series by talking

(00:39):
about the rise of ISIS and how they use social
media as their ally. But as we're recording, the Taliban
completely overtook Afghanistan. As it stands this hour, Taliban fighters
surround the capital, Kabul, and negotiations are underway to secure
a transfer of power. Afghanistan's president has since fled the

(01:01):
country, and much of the population is in hiding, afraid
of what's next. In the eastern city of Jalalabad, Taliban
members fired shots in the air to disperse the crowd,
and alongside their rifles, they've been carrying smartphones. They've been
using social media to speak to Western observers, to

(01:23):
try to project a more modern image. The Taliban has
been banned from networks like Facebook and YouTube, but over
the years they have still found ways to refine their
image and spread their messages on Western social media platforms.
The group used this technology to build political momentum that
aided in the takeover of Afghanistan. But if you go

(01:44):
back and look through the early history, the Taliban were
on the internet in two thousand five, sending out perfectly
literate English-language announcements almost every day to the international press.
Let's go live now to Doha to speak to Suhail
Shaheen, the Taliban's official spokesperson for international media. By

(02:04):
two thousand nine, they were on YouTube; not long after, they
were on Twitter. Soon the Taliban were on both WhatsApp and Telegram.
They'd moved to the same encrypted messaging platforms as everybody else.
And before long, the sorts of videos and images that
the Taliban were posting were indistinguishable from the sophistication of

(02:26):
the propaganda of the Islamic State. These glossy, high-definition videos,
sometimes shot by drones, were of firefights, of suicide attacks
on US bases. And then when the Taliban began to

(02:48):
negotiate a settlement with the Afghan government, they gained
even more legitimacy. They were on these social media platforms.
They began to court Western media. They published an op-ed
in the New York Times. They really cast
themselves as a legitimate alternative to the Afghan government. And
then Kabul fell, and their use of social media, while

(03:10):
not the foundation of their victory, played a huge part
in it. I'm Peter Singer. That was my co-author,
Emerson Brooking. Over the last six weeks, we've told the
real-life stories of LikeWar. The Taliban's takeover of
Afghanistan is just one more reminder of the devastating consequences

(03:32):
that happen when we're unprepared for how these online battles
play out in real life. Now it's our job to
answer the most important question facing us all: how do
we fix it before it's too late? This is LikeWar,
Part Seven: What's Next? So what can we do

(03:57):
about the problems that so trouble us online? The
President is weighing in on a story we've been following
this week: fake news that popped up on social media
during the election season and may have influenced some voters.
As misinformation and so-called fake news continue to be
rapidly distributed on the internet, our reality has become increasingly
shaped by false information. Fake news can have real human

(04:20):
consequences when it spreads from the online world to the
offline world. There's no single silver-bullet solution to
this problem, but we can manage it in a much
better way through a more comprehensive strategy that involves us all.
There's a layered approach to all of this. So when

(04:43):
we're talking about the challenges of LikeWar, whether they're
striking in terms of public health (and there are allegations
tonight that Russia has launched another disinformation campaign, this one
to undermine global confidence in a potential COVID vaccine) or
challenges to our democracy, there are three levels of action: government, industry,

(05:11):
and individuals. When it comes to the role of government,
we do not have a comprehensive strategy for the challenges
of like war. We have a variety of pockets of
activity that have grown in recent years. So you've got
programs dealing with, for example, Russia. In the day's other news,
Russia's official RIA news agency reported that the

(05:32):
US and Russia are talking about creating a cybersecurity
working group. President Trump had raised a similar idea during
the G20 summit, but backed off under heavy criticism.
This latest report comes amid multiple US investigations of Russian
meddling in the election. We have other aspects that are

(05:54):
dealing with, for example, election security. Today, it's my great
honor to sign the Cybersecurity and Infrastructure Security Agency Act into law.
So as the cyber battle space evolves, this new agency
will ensure that we confront the full range of threats
from nation states, cyber criminals, and other malicious actors, of

(06:16):
which there are many. And so what you need on
the government side is a comprehensive approach that brings together
all the different arms of government. The US can and
should look to other countries that have endured the same
attacks and made larger systemic changes. Estonia faced a major

(06:38):
crisis in two thousand seven when it became the first
country to experience a massive cyber attack, which took down
Estonia's email, bank, and newspaper servers. The state portal, the
president's portal, the banking portals, the newspaper portals. They were jammed. Basically,
they were not available anymore. And this created a lot
of panic. And the assumption is that our neighbors were

(07:00):
responsible for this; in other words, Russia. When Estonia was attacked
by Russia, the nation immediately took steps to combat the
threat and is now one of the most cyber secure
democracies in the world. The country came together, the government
went public, describing the attack and the steps it was
taking to thwart it. The next year, the Cooperative Cyber

(07:22):
Defense Center of Excellence, a multinationally funded think tank, opened
in Tallinn. There, they train military and civilian experts from
twenty one countries in how to protect against cyber attacks on
government systems, banks, and utilities. As yet another line of
defense against cyber attack, the country is creating a backup
system, what they call digital embassies, in which Estonia

(07:44):
stores a backup copy of all its digital assets in
another country. Estonia's collaboration with other nations is one of
the responses that our own nation should take inspiration from.
We have to ask our government: when so many other
nations have been the target of Russian cyber attacks, why
didn't we come together with them? One of the keys

(08:08):
to combating these future threats is to work with these
other democracies, to share best practices and to respond to
cyber attacks as a united front. If you hit one
of our elections, all of us are going to respond.
So to fight this ongoing online threat, we need to
bring together the different arms of our own government. We

(08:31):
need to create a comprehensive plan with foreign governments, and
we also need an effective way of keeping the internet,
and the mis- and disinformation that comes with it, in check.
A lot of governing bodies around the world are starting
to say, look, we created our government before there was
an internet, so we have no one to govern the Internet.
Maybe we should create an Internet regulator. The EU was

(08:52):
talking about it, Australia is talking about having an algorithm
review board. Here in the US, we are so far
from having that conversation, but hopefully it's one that
we're going to start to have because quite frankly, we
are in trouble here and so that's why you go
after it on the governmental level. But even if government
did everything it could and did everything it could perfectly,

(09:16):
that still would not be enough. There is also an
important role for the private sector and the individual. Do
you see a potential problem here with a complete lack
of fact checking on political advertisements? Well, Congresswoman, I think
lying is bad, and I think if you were to
run an ad that had a lie, that would be bad.
So you won't take down lies or you will take

(09:38):
down lies. I think it's just a pretty simple yes
or no. In a democracy, I believe that people should
be able to see for themselves what the politicians that they
may or may not vote for are saying. So you won't take them down. They
conceive of themselves as technology companies, as creators of technology.
They are that, but they are also something else. They

(10:00):
are now media companies, and they are in fact the
most powerful media companies in all of human history. They
may not like that fact. It may not square with
the vision of who they are and how they were
trained, their location in Silicon Valley, and all that identity,
but that is the reality of who they are now,
and that brings a very different set of responsibilities, and

(10:22):
we see them kind of coming to grips with it slowly.
They have a responsibility. They are media companies of
the modern age, and so they have to start
behaving a little bit more like that. Facebook's algorithms
now review little known websites whose articles get sudden surges
of traffic, a classic red flag for misinformation and clickbait.

(10:42):
This week, the social media giant announced that it deleted
eight hundred and sixty five million posts in the first
three months of this year. Most of it was spam,
but that isn't everything. The company has expanded its fact
checking efforts. If you happen to watch a video about
NASA confirming that Earth would go dark for several days

(11:03):
because of a solar storm, Facebook wants you to know
that it's a hoax and it's trying to remove this
kind of content from its platform. So far, Facebook says
it's partnered with nearly thirty third-party fact-checking organizations
all around the world. It's added features to Messenger to
help users determine if they're interacting with real accounts or impersonators.

(11:25):
And then finally, they began to deplatform, going after
high-profile individuals who had been consistently the super
spreaders of this malicious information. Twitter permanently banning the Commander
in Chief's personal account with eighty eight million followers after

(11:46):
an initial twelve hour lockout following Wednesday's riots at the
US Capitol. These actions also include slowing the spread of
misinformation by limiting who can share information. Here's
one example. Say you have people in a group who
repeatedly share information that's been rated false by fact checkers.

(12:06):
In that case, the company can reduce the reach of
the group as a whole by cutting down on the
number of Facebook users that the algorithm suggests should join
the group. These are all important changes that would have
been unthinkable a mere decade ago, but they still are
largely reactive to the problems. That is, they wait for

(12:29):
the bad thing to happen, see if there is enough
outcry about it, and then respond. And we can see
this pattern. It's a thread that runs through literally
the start of this space, whether it was the early
days of child porn, to terrorist use in beheadings, to
violence on January six, to anti-vax conspiracy theories.

(12:52):
Each one of these problems was apparent, was warned about,
was known about, and then only after it played out did
the companies take more action. Had they
acted earlier, we would not have seen some of the
more terrible real-world effects of it. So essentially it

(13:14):
comes down to the companies recognizing that they play a
very different role in our world than how they originally
conceived of themselves, and that brings a very different set
of responsibilities. So when we talk about the challenge of
disinformation and how to stop it, we talk a lot

(13:35):
about what governments are doing or what the tech companies
are doing. But there's another player here, and that's civil
society organizations. Now I'm fortunate to be a resident Senior
Fellow at the Digital Forensic Research Lab of the Atlantic Council,
and we were founded back in twenty sixteen, during the Syrian
Civil War and especially as Russian troops engaged in that conflict.

(14:00):
After the revelations of Russian election interference, we were right
there to study those patterns as well. One of the
big changes we've tracked since then is how disinformation and
social media manipulation have been outsourced. They're not so much

(14:21):
the province of governments anymore. Instead, there are basically marketing firms, contractors,
digital mercenaries who carry out this work all around the world.
In Indonesia, there is an industry of people who run
fake accounts for profit. They are called buzzers. Iqbal, a

(14:42):
buzzer, explains to us how it works. Clients pay him
to promote content with these accounts. To appear authentic, these
accounts will post ordinary things tailored to fit their profiles.
So now, his fake accounts are much harder for Facebook's or

(15:04):
Twitter's algorithms to detect because there is a human behind them.
And the trouble is that even as tech companies have
grown a lot more aware of campaigns targeting the West
in the English-speaking world, there are still huge gaps
in their knowledge and their monitoring. And extremely primitive campaigns that

(15:28):
target Nigeria or Western Indonesia receive very little attention, and
a lot of these campaigns would go unidentified if it
weren't for groups like ours who can bring them to
the tech companies' attention, and to the attention of government regulators,
so action can be taken against them. In addition to

(15:50):
what government and private industry and civil society can do,
there's still the role of the individual, and the role
of the individual matters for many reasons, but the
most important is that they are the target of thinking attackers.
If you change the legal code, if you change the
software code, a thinking attacker, whether they are a Russian disinformation warrior,

(16:16):
a domestic extremist, an anti-vax conspiracy theorist, whatever it is,
they'll change their approach. They'll work their way around it,
so the target will still be hit. So we as
individuals need the skills to manage our own way through
this world. It's also about having a sense of ethics,

(16:39):
a sense of responsibility for others. So think about, for example,
covering your mouth, whether it was, in
the olden days, covering your mouth when you cough, or now
covering your mouth with a mask. It's not only about
protecting yourself, it's about protecting everyone else that you come
into contact with. It's the same thing in LikeWar:
it's not just about what you digest when it comes

(17:04):
to information, and mis- and disinformation; it's about what you
share with others. That part of the responsibility, that part
of the ethic, has been largely missing, particularly in the
United States. Here again, we can learn lessons from other
nations that have handled this a lot better than us.
The Estonias, the Finlands of the world: they've not

(17:27):
broken up Facebook, they've not changed Russian information warfare, but
they've managed it far better than us because they have
built in everything from changes in how they identify incoming
information threats within their government to how they teach students
in their schools to handle this space. And what they

(17:47):
teach is a concept that we call cyber citizenship.
Cyber citizenship is the bringing together of three concepts:
the critical thinking skill of what we call media or
digital literacy. When I'm exposed to this information, how do
I ingest it? How do I understand it? The second

(18:09):
thing that it brings together is the concept of what's
been sometimes called digital civics. It's not just about critical thinking,
it's about a sense of responsibility. The third area it
brings together is classic cyber security: understanding that there are
threats out there, active actors that are trying to target

(18:30):
and manipulate you, and these are their tactics. And so
cyber citizenship brings together those three elements: those critical thinking skills,
that set of responsibilities, that awareness of threat. The first
step is making those tools accessible to the people who
need them most: teachers. To make that a reality, the

(18:51):
think tank that I work at, New America, has partnered
with Cyber Florida, the Florida state education system's cybersecurity program.
We've brought together the National Association of Media Literacy Educators,
science and technology teachers' associations, you name it; we're bringing
them together. But most importantly, we're building a portal that

(19:12):
is the first ever gathering of the wide variety of
teaching tools for this so that finally a teacher will
have a place they can go to and find exactly
what they need for their students. And most importantly, they
can see what other teachers are saying about those tools,
how they found them useful, and how they've
built them into their curricula, so that we're learning from

(19:33):
each other. My colleagues at New America, Lisa Guernsey and
Jimmeka Anderson, are here to explain exactly how this portal works.
My name is Lisa Guernsey, and I direct the Teaching,
Learning and Tech team at New America, and I've been
working on the Cyber Citizenship project with Jimmeka. And I'm

(19:54):
Jimmeka Anderson. I am a media literacy educator. We came
up with this concept: hey, let's have a portal with
resources that these educators and librarians don't have to
search all through Google to locate, but that are
located in this kind of one-stop hub where they
can filter through and find what they need based on lessons,

(20:16):
based on their subject matter, based on the age group
that they serve, to provide that education to the students
on misinformation, and we have curated over ninety resources online,
such as videos that can explain to you what misinformation is.
There are certain skills associated with helping teach kids to

(20:40):
identify misinformation, such as lateral reading. When you're on a
new website, instead of staying put and taking their word
for it, you should open a new tab and start
looking for more information. That's called lateral reading. It's lateral
because instead of moving up and down, you're moving from
tab to tab. There are videos and content on that. But

(21:02):
then there are also assessments and handouts that teachers can utilize
and incorporate in their classroom lessons. There are quizzes on there,
so you can see how likely you are to be able
to identify misinformation online. And I'll just add that
one of the things that we're really happy about with
this hub that we've built is that you can search

(21:24):
and filter based on grade level, as well as
kind of based on the material type that you might
be looking for. So depending on the age of the
students in your classroom or the kids in your household,
you can look and see: okay, is there something that's
better for elementary school kids? And could I find something
that's maybe a game for elementary school kids? So you

(21:45):
can, using drop-down menus, filter to get more specific
about the kind of resource that you might want to
use with your students or with your children,
and that can help educators navigate the sea of
information that's out there. Right now, a lot of people,
a lot of publishers are developing some really cool tools
that get us to be more informed citizens and allow

(22:08):
us to build resilience. But finding all those tools is
the big challenge, so we're really hoping this hub is
going to help with that problem. One of those cool
tools is a lesson to figure out just how good
you are at judging if online information is real or not.
It's quite simple, but honestly, it's like a lesson
plan that revolves around a photograph that's been doctored, that's

(22:31):
been manipulated to make you think it's showing you one thing,
but actually it's showing you something else. So in this
particular case, it's a photograph of a pig fish. It's
literally, you know, been photoshopped so that there's
a pig face on a fish's body. And you could
put this image up, say, if you're
a teacher in a classroom with a projector or whatever,

(22:53):
you could put this up on the screen, or you
could even use this in a conversation with your
kids at home, and you can start asking questions about
this image, like: do you think this is a real image?
Why do you think this image was created? And
how was it created? And who wanted you to
think that this pig fish was a real fish?
And it generates a whole line of inquiry just

(23:16):
by being able to look at this. And then, we're
hoping, it helps people to develop those kinds of skills and
habits of mind, so that the next time they might
see an image that's maybe posted on Instagram or that's
running around Facebook, they might say: hmmm, is that image really
valid and verifiable? How can I find out? In
this case, they're helping teachers to point out that you

(23:39):
could use fact-checking sites like Snopes, for example,
to help students see that, oh, this pig fish
image was in fact fake, and I can go to
Snopes.com and see what it was
that helped to create this image in the first place
and why it's fake. Teaching students skills like this from

(24:01):
a young age is absolutely necessary if we want to
effectively combat LikeWar in the future. But building a
resource portal is just one step, and there are multiple
other challenges to tackle. The first is, you might have heard,
that Florida is only one out of fifty states. We
need to do more of this on the national level.
The second is, you might have heard, that the United

(24:22):
States is only one of the many different democracies
out there. Can we build global versions of this,
so that not just the Estonians of the world have
these toolkits, but we can provide them to the teachers in
these other nations that are increasingly being targeted?
And so one of the other issues in the United
States is that it's not just an issue of making

(24:43):
these tools available. You have to alter the standards of
what is expected to be taught in our schools. And we've
always altered our standards. Back in the day, they taught
kids reading, writing, arithmetic, and animal husbandry, because
that's what you needed in the nineteen hundreds. Well, it's
different today, and so you're constantly changing your standards and

(25:04):
updating them. And we need to update our standards and
that will also allow teachers to teach towards this. And
I will also just add that understanding that a lot
of people in the media literacy community for many, many
years have really felt that there should be an investment
in the educational world. There's organizations such as Media Literacy

(25:27):
Now, where they have chapter leaders in each state who
are trying to lobby and push for media literacy to
be added in their states as a requirement in schools.
Some states have had some success, but many states have
not at this time. So I think that there's a
challenge on the kind of policy side of things with

(25:49):
actually seeing that need for investment, but I'm very hopeful.
Within the last couple of years,
I think a lot of shifts have elevated the
value of and need for media and digital literacy. Also, some
states, such as North Carolina, have
placed it under the role of the librarians and

(26:10):
media specialists in the schools to embed some of those skills.
But there are also some schools that don't have
media specialists, and so there's the challenge of ensuring
equity is in place, that everyone is getting access to
these skills and having that instruction and information. And if
it's not a part of the curriculum, then it may

(26:32):
not be taught, or it may not be considered a
priority to teach by some teachers or some schools. Education
is key to combating future examples of LikeWar, but
it's important to note that there are also paths for
those of us who are no longer in school. I
would say, no matter your age, it's never too late

(26:53):
to get involved in information literacy, in learning to use
the internet and this incredible online ecosystem more to
your advantage, to be less vulnerable to disinformation and people
who would take advantage of you online. One extremely underutilized

(27:13):
resource in the modern age is the public library, which
still exists, believe it or not, in a lot of towns.
There are people whose job it is to categorize and
navigate massive amounts of information. And if you just want
to learn how to improve your own research methods, how

(27:34):
to use the internet better, starting at your local library
with these professionals, whose job it literally is to help
you do that, is not a bad way to start.
It is extremely important for us to learn cyber citizenship
skills because the Internet and the online world is a
part of our reality now. It's the world in which

(27:55):
we live, and in order to navigate in this space,
we have to understand it. We have to understand the messaging,
the content. But most importantly, we have to understand how,
if we don't have those skills, it
changes our beliefs, and our beliefs change
our actions, and that impacts our health and impacts

(28:18):
our safety. And so that's why it's very key: because
now we're living in a situation where our health
and our safety are at risk, I think
there's a need for an investment on a political front,
but also an educational front, and individually, to understand
why it's important for us to pursue having those skills

(28:41):
as well. What's so crucial to recognize right now is
that education policy and national security policy and health policy
are all wrapped together. If you don't deal with like war,

(29:01):
you will not be effective against public health issues like
the pandemic. The reality is, this anti-vax, anti-science
segment that so often was relegated to the fringes of
our society has grown, and we will not defeat COVID
nineteen if we don't beat back the misinformation that is taking
hold and costing people their lives. You will not

(29:22):
be effective in dealing with climate change if so
much of the climate change debate is warped by this phenomenon.
Climate change is easily targeted for false news reports because
it's somewhat difficult to explain. George Mason University released a
study saying that only fifteen percent of people understand that
human activity has been the main cause of global warming,

(29:44):
but scientists agree on this. If we don't teach those
skills of cyber citizenship: critical thinking, a sense of ethics online,
an awareness of threats online, we will not be
able to protect our nation, protect our democracy, protect our

(30:09):
public health. So they're all wrapped together. Now, to find
out more about how you and the next generation can
start to battle LikeWar: the Cyber Citizenship portal is
available to everyone. You can find it at
CyberCitizenshipEducation.org. This is a production of iHeart Podcasts,

(30:44):
Graphic Audio, and Goat Rodeo. Cara Schillen, that's me, is
the series lead producer. This episode is just one of
a seven-part series. Find the other episodes wherever you get
your podcasts. If you'd like to dive deeper into the
work of P. W. Singer and Emerson Brooking, you can
access the full audiobook LikeWar, on which this

(31:07):
series is based, wherever you get your audiobooks. Writing
and editing from Cara Schillen; production assistance from Isabelle Kirby McGowan.
Senior producers are Ian Enright and Megan Nadowski. Please
share this series with the hashtag #LikeWar to find

(31:28):
other conversations about the series. Thank you for listening.