
December 2, 2021 • 46 mins

Can we inoculate ourselves against misinformation and conspiracy theories in the way we do for infectious diseases? Instead of debunking, can we “pre-bunk?” Sander van der Linden, co-founder of Inoculation Science, has created games that offer to do just that. Baratunde plays one of them and speaks with Sander about online misinformation campaigns, polarization, and how we can better protect ourselves.


Guest: Sander van der Linden

Bio: Professor of Social Psychology in Society at the University of Cambridge, co-founder of Inoculation Science, author of The Truth Vaccine (in progress)

Online: Inoculation Science website; Sander’s website and Twitter @Sander_vdLinden


Show Notes + Links

Go to howtocitizen.com to sign up for show news, AND (coming soon!) to start your How to Citizen Practice.

Please show your support for the show in the form of a review and rating. It makes a huge difference with the algorithmic overlords!

We are grateful to Sander for joining us! Follow Sander at @Sander_vdLinden on Twitter, or find more of his work at inoculation.science


ACTIONS


- PERSONALLY REFLECT

Reflect on the game. 

After you play the game at https://inoculation.science and watch a few videos, reflect on how they made you feel. Are there online experiences you’ve had that make more sense once you consider you might have been intentionally manipulated? How do you think these games will affect your future online experiences?

 

- BECOME INFORMED

Play the game. 

Point your browser over to https://inoculation.science and play their set of inoculation games. In addition to Breaking Harmony Square, which we featured in this episode, they offer games to help you limit the harm of fake news and COVID misinformation. 

 

- PUBLICLY PARTICIPATE

Share the game. 

Finally, share the games with people you care about. Friends don’t let friends spread misinformation. 



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Welcome to How to Citizen with Baratunde, a podcast that reimagines citizen as a verb, not a legal status. This season is all about tech and how it can bring us together instead of tearing us apart. We're bringing you the people using technology for so much more than revenue and user growth. They're using it to help us citizen. Alright, let's play "let's destroy society." Congratulations, you're hired. Welcome

(00:37):
to your first day as our new chief disinformation officer. Let's get started. We hired you to sow discord and chaos on Harmony Square. That's what's up. That's what I'm here for. Who are we about to mess up? So right now I'm playing Harmony Square. Normally this game doesn't

(00:57):
have music or sound, but my team took some creative liberties to bring you into the experience with me, and clearly I'm having way too much fun about to sow some chaos. Harmony Square is a green and pleasant place.

(01:18):
It's famous for its living statue, its majestic pond swan, and its annual pineapple pizza festival. Any place that does a pineapple pizza festival deserves some discord. Yeah, pineapple pizza, that's disgusting. Now, the goal of this game is to disturb this hypothetical quaint small town's peace and quiet by

(01:43):
fomenting internal divisions and pitting its residents against each other. I am oddly excited about the prospect. There are no bears here, never have been. But Harmony Square loves elections so much they keep voting for a bear patroller. Anyway, one of the ways you get to divide this absurd little town is through an election. It's for bear patroller, and there's only one candidate, Ashley Plute. What kind of

(02:07):
language do you think is most likely to ruin the bear patroller election? Tears and fears, or facts and logic? Tears and fears. So as I'm going through the game, I'm getting presented with these choices for how I intervene, and I got real excited about the chance to make fake news memes. Since you're so clever, why don't you

(02:27):
choose some electrifying buzzwords to include in your meme. I get to pick three buzzwords. Ooh, corrupt, abuse, and lie. Yeah, we're taking Plute down. Now you can put together an emotionally abusive meme. So I decided to go with this meme. It shows two white dudes shaking hands above the table,

(02:50):
but underneath, money's changing hands. Unopposed? That's just a fancy word for corrupt. Careful, though. You posted some content that wasn't emotionally exploitative. Cost you a couple of lives. The megaphone will be much more successful if you use the right buzzwords. Damn. Alright, coach. Yeah, this is mad devious,

(03:11):
like, this is by far the best thing I've seen that explains this. Like, I'm Breitbart, you know. This is wonderful, and by wonderful I mean terrible. This is a really twisted game, because I thought I was playing it aggressively by choosing to disparage a newscaster

(03:33):
on a small scale, you know, talking trash to a friend or a family member. It turns out the game is like, that's not devious enough. You must scale your deception. And so it encouraged me to kind of create a more public platform for the disinformation. To hell with responsibility. Let's crank it up to eleven. I'm trying to destroy the town.

(03:53):
Like, this game has really got me. Why do they even call it Harmony Square? I think that's what bothers me. I want to make it Discord Square. Let's go. Can we get some tiny violin music up in here? You did it. You ruined the biggest moment in Harmony Square's history.
They're all at each other's throats now, okay, in the end,

(04:17):
let's see, still counting, you have reached fifty-six thousand four followers. You did better than eighty percent of the people.

(04:38):
Who knew simulating the emotional destruction of a small town could be so much fun, and that I'd be so good at it? I mean, it's fun, but it's also scary. Now, Harmony Square is a fictional place obsessed with democracy, electing bear patrollers. But other than the bear patrol thing, it's pretty similar to the world we live in right now.

(05:00):
The election has cast a light on misinformation online. False election stories from fake sites. The dangers of fake news. Ever since the presidential election, our internet has become increasingly more divisive, with fake news spreading like wildfire. And combating all this misinformation can feel like you're

(05:20):
playing whack-a-mole. You take one down, three more pop up. But in this case, millions more pop up, whether it's misleading headlines, divisive memes, or trolls. And these little rodents, they're burrowing into the fabric of our society. It's kind of maddening, and admittedly disgusting with the whole

(05:41):
rodent metaphor. A few years into this fake news ecosystem, social scientists are seeing how easily we can fall prey to some of our basest human instincts, and fear is a hell of a drug and a terrible motivator. As we click, share, and retweet about the pandemic, racial injustice,

(06:04):
climate change, and more, many of us have become aware
of how deep and dark the pit of social media
can be, but few of us have had the chance
to drive that descent like I got to do in
the fake world of Harmony Square. But Harmony Square isn't
just reflecting real world problems in a cute way. It's

(06:25):
helping us battle them too. Yes, a game can help
us fight misinformation and disinformation. Believe it or not, this
game was preparing me to fight off trolls by getting
inside their heads. In other words, it's like a fake
news vaccine. Sander van der Linden is one of the people

(06:51):
behind Harmony Square. He's a professor of psychology at the University of Cambridge, studying how people are influenced by social media and misinformation. His team of researchers has partnered with game developers to create several games just like Harmony Square. They call themselves, get this, Bad News Games. Essentially, these

(07:12):
are choose-your-own-adventure games, all free to play online, and they're a real way that science can go out and reach people. But even he admits it's an ongoing struggle. If I told you that I went down your street to one of the restaurants and, you know, I got food poisoning real bad, and a week later I tell you, oh, look, listen, actually it wasn't that restaurant, it was another one. Every time you're going to pass

(07:34):
by that restaurant, you've gotta think food poisoning. After the break, Sander's prescription for our social media ills, and a lesson in Dutch salutations. What's up, Sander? Welcome to How to

(07:56):
Citizen. Pleasure to be on the show. Goedendag. That's close. That's close. Yeah, correct my terrible Dutch with the proper pronunciation. Goedendag. Okay. Wow, that's good. That's good. Yeah. Yeah, there's a, you know, people say, oh, you know, Dutch and Germans are pretty much the same, but we feel strongly that there's a nuanced difference there. Your field

(08:16):
of specialty is timely and it's fascinating about human decision
making and influence and judgment and communication around all those things.
Can you break down how you describe what you research?
You know, at a very basic level, I try to
research how people are influenced by information, how people are

(08:38):
persuaded by ideas and information. And what I've become really
interested in the last few years is how we can
help people resist and detect attempts to manipulate us online
but also offline when it comes to you know, fake
news and misinformation, disinformation, all of those things. But I also study, you know, the nature of how information spreads on social media and how that influences people, and what is happening on these platforms. Where are people engaging in flame wars? And, you know, why do we see polarization? Is social media a good thing or a bad thing? So, very difficult and complex questions. When did you start down this path?
I started out pretty late. I didn't necessarily come from an academically oriented family, and so I got a job.

(09:23):
I thought that's the thing you need to do. You need to make some money and get a job, go into the real world. And so that's what I did. And I actually ended up working at a bank and had kind of a crisis in terms of what I was doing with my life, and so I decided to quit that job and go back to school. And that's
how I got into academia. And I think I was
lucky that I got to work in a few jobs
that I really didn't like. So I never looked back

(09:45):
in terms of my own experience, because what I get to do now, experimenting on people, is fun. Even when I was little, I loved experimenting on people and learning about how they react. You know, I had set up elaborate schemes to see what people would do, just because of my curiosity about human behavior. I
hear you using the term misinformation and occasionally disinformation, and

(10:06):
out in the wild, these terms are often used together.
Sometimes there's a slash between them. Sometimes people use them interchangeably,
even if they might not intend to. So for the
record and for clarity, what is misinformation? What is disinformation?
How are they different? That's a great question. In a lot of the work we're doing, I've defined misinformation as

(10:27):
information that is simply false or incorrect, and so this can include things like simple journalistic errors. But it doesn't tell you whether it's misinforming people by accident or intentionally. And so, for me, disinformation is misinformation coupled with some psychological intent to deceive or harm other people. And that's

(10:48):
also why people get more upset about certain kinds of disinformation than others, because we can all forgive people for making honest mistakes and errors, but it's different when someone's targeting you or actively trying to dupe you. But it gets complex. Let me give you an example. Take the Chicago Tribune, which is, you know, otherwise credible. I think the headline was, doctor died shortly after receiving

(11:10):
the COVID vaccine. Now, these were two independent events, and one might have nothing to do with the other. There was an investigation ongoing. But are you misinforming people by constructing a headline in that way? And so it's not only these sort of fringe outlets spamming us with disinformation. Here's the question: do they do that intentionally?

(11:33):
I don't know. I think these are the complex, bigger questions that we try to study. The formulation of disinformation equals misinformation plus deceptive intent, that resonates with me. That's how I've tried to understand it. But what you just shared about the Chicago Tribune example reminds me that even misinformation,

(11:54):
the innocent kind of version of disinformation, can be harmful,
and both versions breed a level of mistrust overall. Because
just my level of doubt is raised now about vaccines,
about the Chicago Tribune, about Facebook, because I just don't
know whether they intended it or not. Falseness is spreading

(12:16):
throughout the land. I think that's a great point, because even something that wasn't intentionally created to be harmful, let's say it was a mistake, can then be used or weaponized by people who have a certain motivation or political view. So if you don't like the vaccine, this is now a great example for you to start sharing. See, you know, a doctor died because they got the vaccine. And so now it can be weaponized and used on

(12:37):
social media for a cause that maybe it wasn't intended to serve. So something can start out as misinformation and then become disinformation. So, our whole podcast is called How to Citizen, and the premise is that we all have a role to play in shaping our society, the whole self-government thing. We believe in it. We're nerds for that reason.

(12:58):
I'm curious what you've learned about human behavior and decision making in digital spaces that can make it hard for us to self-govern and participate in our society. You know, I think one of the lessons that I've learned is that, at a basic level, people do have a motivation to be accurate. We do want to know what's going on in the world.

(13:18):
For most people, that's kind of a default baseline. But then, when you're put in situations that basically thwart that internal sort of radar that you have, things can get pretty ugly. What happens when you go on social media is that there are all sorts of different incentives that appeal to people that have nothing to do with accuracy. What are other powerful forces that influence our decision making? You

(13:42):
go online, you see something that's been shared by somebody you trust and know. That type of information gets priority from people, because we use it as a heuristic. If information comes from somebody you already know and trust, there's an implicit assumption that it's been vetted and verified by that person, and they wouldn't share anything to dupe people. But now it also has a thousand likes, it's been shared fifty thousand times. That's a powerful indicator something important

(14:06):
is going on, and that you might want to share it as well. And then there's the filter. So Facebook is filtering things based on your prior click behavior and things that you've looked at. And then, you know, you're faced with making a decision. What are you sharing online? What are you paying attention to? You know, in one study, we looked at millions and millions of posts on Twitter and Facebook, and what's the stuff that generates the most engagement?

(14:31):
It's posts that derogate the outgroup. So if you're a liberal, the outgroup is a conservative. If you're conservative, the outgroup is a liberal. And so we coded posts for whether they were, you know, positive or negative and about liberals or conservatives. And across millions of posts, the number one thing that got the most traction is basically trash talk about the other group. That is what gets engagement on social media. So when you come in

(14:58):
all accuracy-motivated and calm and honest, you get distracted by this incentive to start hating on people, essentially, because that's what gets likes. So that's what gets promoted, that's what gets shared, that's what's the norm on the platform. Then that's what influences your judgments and behaviors, much more so than the facts. It's kind of

(15:19):
like you're back in high school when you're on social media. That social pressure is back, you know. And how do we fix that is a big problem, and so we certainly have some ideas. Whether or not social media companies are keen on them is another matter. And I should say that I do advise social media companies. Part of what they're doing, on a micro level, is they're trying to find misinformation,

(15:42):
they're trying to get fact checkers on board. I help them on how to debunk misinformation more effectively on their platform. And these are kind of micro solutions, right? There's a problem, they try to fix it by getting more facts out there, by, you know, upgrading the way that they correct misinformation on their platform. But at the end of
the day, I think what they're not thinking about is

(16:04):
that if you really want to change the incentives that
people face on social media. You're gonna have to rethink
the whole nature of the platform. What we want to
envision is a place where people have a motivation to
be accurate, to share factual information, to have constructive conversations,
to have positive conversations. The other thing that they'll say,

(16:25):
and I think this is very interesting to me, is that, we didn't design this platform to help people be as accurate as they can be. That's not the purpose of social media. They admit that: we're not interested in getting everyone to have, you know, scientific opinions and be truth-driven. We're not in the truth business. And they want people

(16:47):
to have fun, to have the conversations that they want to have, even if they're spicy. And they'll admit it and say, look, that's not our purpose. Our purpose is to let people have all kinds of conversations, and we're not going to necessarily regulate what people say. I think that's the issue. I think maybe, to create a better environment for everyone on social media, we have to just fundamentally change the incentives, which means you're not going

(17:10):
to get as much engagement, and that's a difficult ask. If you really want to fix this problem: we're gonna have less engagement, which means you're gonna make less money, which means we're gonna have different kinds of incentives. And I think that's just not a business decision they're willing to make. When we come back, Sander and I get into how playing games can enhance democracy. You can

(17:32):
do both at the same time. It's dope. Talk to me about how someone engages in spreading wildly inaccurate information on one of these platforms. I'm talking microchips in my

(17:55):
Moderna shot, I'm talking 5G towers pumping out COVID. Even things that are counterfactual to observed reality fly around. What is the psychology of that kind of spread? Why do people keep sharing it? Yeah, why do people keep sharing it? One theory is kind of what I call a more generous take on the human decision-

(18:19):
making condition that we find ourselves in, right? It's called the inattention account. Think of the brain as a computer being bombarded with information: our memories are limited and our attention span is limited, you know, there's too much going on and we're getting distracted. If only there was a way to bring accurate information to your attention,

(18:41):
then the problem would be solved. Overwhelmed human is the charitable interpretation. What's the other one? The other account is a bit more nefarious, right? It suggests that people are actively biased, and that we share content because we want to promote or identify with the kind of social groups that we belong to. We have a political identity that we want to make salient to people. It might be

(19:04):
the case that you share content not because you really believe it. It's not that you don't believe in climate change or you don't believe in the vaccine. You're sharing it because it reinforces the narrative of your group. It makes the connections that you have with other people you care about stronger. It helps give you a sense of purpose and agency, and that you belong to a movement. Um, think about QAnon, for example. And

(19:27):
so it reinforces what we call a sense of social identity. What you're describing sounds like gang colors and membership. You're literally signaling your membership, and the facts, or lack thereof, don't matter nearly as much as waving that color.

(19:49):
It's very understandable, and I think I can even see it from my own experience, this whole idea of, like, when I see an article that confirms me, I'm like, yeah, that's right, I knew Big Evil Corp was big and evil. Then I'll share the hell out of that. But then if I see some, like, wonderfully written defense
But then if I see some like wonderfully written defense

(20:09):
of like well why globalization has actually been on net part,
I'm like, whatever, that's be as somebody made that up.
That's this information because I don't even I actually don't
want to believe it because it challenges me, not intellectually,
but like identificationally or something like I don't I already
know who I am. I don't want to be someone different.
I've invested a lot in this identity, so I'm not

(20:31):
going to share something that challenges me. Then you just add, you know, fuel to that fire when you put it on a technology platform that has a financial incentive to turn up those dials, and you hit both explanations of why we do what we do. So, you have this concept of pre-bunking that I find fascinating. Can you explain it? Yeah, absolutely.

(20:53):
Pre-bunking is the idea that rather than trying to correct something after the fact, which is usually called debunking, you try to do it preemptively. Mm hmm. But the idea here goes further. It follows the biomedical vaccination metaphor exactly. So, just as you inject people

(21:15):
with the weakened dose of the virus to trigger at
the production of antibodies and an attempt to help confer
immunity against future infection, turns out you can do the
same with misinformation. When you expose people to severely and
sufficiently weakened dose of the misinformation virus quote unquote or
the techniques that are used to spread misinformation, people can
build up cognitive or intellectual antibodies against them and become

(21:38):
more resistant. So we should pre-bunk when it's possible. You know, viruses have different incubation periods, and misinformation pathogens have different periods too, in the sense that even when you've already been exposed, it can still be beneficial, but at some point it's going to be too late. So pre-bunk when you can. If that doesn't work, we can do fact-checking in real time, and if that doesn't work,

(22:00):
we can still try to debunk and correct things after the fact. I guess we haven't really talked about why that's less effective. Very briefly: it's less effective because once you're exposed to a falsehood, it sits in your memory. It makes friends with all the things you know. And we know from research that even when people acknowledge a correction, they continue to retrieve false details from memory about the event. And I think it's

(22:22):
something very basic. I don't know where you live, but if I told you that I went down your street to one of the restaurants and, you know, I got food poisoning real bad, and a week later I tell you, oh, look, listen, actually it wasn't that restaurant, it was another one. Every time you're going to pass by that restaurant, you've gotta think food poisoning.
That's a difficult thing with corrections. It lingers in your
mind because this association has been made. And that's why

(22:43):
pre-bunking is ideal. How did this idea emerge? Can you put me in the room or the zone with you or your team, or wherever this idea came from? How did it emerge? Yeah, well, I can actually tell you. There was a psychologist in the sixties, and it started with psychological warfare, or at least that's one place it started. His name is Bill McGuire. He's no longer alive. Psychological warfare. He

(23:06):
wrote some very early articles on something he called inoculation theory, which at the time was following the biomedical immunization metaphor. But the use of this force as an integral part of combat has now taken on new forms. He was concerned, during the Korean War, you know, about some of the prisoners of war, and there was a whole paranoia about them being brainwashed at the time. Now we

(23:28):
know that there are other explanations for why some of these soldiers voluntarily chose not to come back to the United States. One example was racism. But at the time, the predominant narrative was that these soldiers were brainwashed. Here also was a chance to see directly into the communist state through the eyes of a typical, average young American. And so McGuire was thinking, well, is it possible to develop

(23:52):
a vaccine for brainwashing? You know, how would you do that? The key solution at the time, from the military and the White House staff, was, oh, the problem is American values aren't clear enough to the people. And McGuire said, that's actually not the issue. The issue is that the soldiers were not

(24:12):
prepared for the type of manipulation strategies that they would be confronted with, because the Chinese camps at the time, they weren't necessarily violent. They said, welcome to the other side. We're going to educate you about what's really going on with communism. It's not some evil thing, and we're not necessarily going to harm you. We just want to re-educate you. They presented a lot of counterarguments to capitalism.
They had daily lectures and classes. And so what I think McGuire was trying to say was that the soldiers really hadn't anticipated an attack on the foundations of capitalism, and a lot of them had no prior defenses against it. They just assumed capitalism is good. They were prepared for a war of military arms and weapons. They

(24:57):
were not prepared for a war of information. Exactly. Yeah. He never really got to the propaganda, the misinformation. He kind of left this idea, moved on to other ideas. It got buried for sixty years. So I was sitting in the library one day, and I came across one of his articles, and I was like, wow, if we could develop this idea further now, in this context, it's going to be so interesting.

(25:20):
So we kind of picked up where he left off and started actually testing this in the context of misinformation. And we thought, how could we bring this into the twenty-first century? One of the things we did is we started simulating a social media feed in a kind of simulator machine, together with a gaming company and a media literacy company that we teamed up with, and a bunch of programmers, a big team. And then

(25:40):
we decided to produce some real-world interventions where people can enter what we called the disinformation simulator and be exposed to these weakened doses of the key techniques that are being used to deceive us online. And we found that over time, people can build up, you know, immunity. This is so, so perfect, because simulations are used in

(26:05):
trainings of all kinds. You know, pilots have flight simulators, infantry have first-person shooter simulators, and we use games to teach. So you built this simulator, this game, to extend McGuire's thinking about inoculation theory into a more modern-day practice, not against, you know, Chinese

(26:25):
political propaganda, but against social media-distributed propaganda. What a fascinating path from the sixties to now. One of the quotes that I pulled out during this process was from the Harry Potter books' Professor Severus Snape, who said, to fight the dark arts, our defenses must be

(26:46):
as flexible and inventive as the arts that we seek to undo. And I think our common realization was that the dark arts of manipulation are evolving. Science is a yawn factor for a lot of people. We've got to go out of the lab and produce some things that are entertaining and fun for people, so that we can actually get this out and test it in the real world and

(27:08):
make it fun and entertaining, you know, so people don't get the feeling that they're attending a lecture, but are actually playing a part and generating their own antibodies. I've played the game that you and your team have created that's built around this inoculation theory, and I gotta

(27:30):
tell you, I'm very impressed. It was like lifting a veil on the Matrix. I was like, oh, that's how this works. Oh, I was invested in creating disinformation. You kind of simulated me as a chaos monkey, as an agent of chaos visited upon this fictional place. Can you explain the game Harmony Square and how it

(27:54):
works to put into practice this inoculation theory we've been talking about? Yeah, you know, it's great. So we have a couple of interventions, and Harmony Square was one that focused on disinformation during elections, political sort of disinformation. We also have Bad News, which is our general simulator, which is not specific to a particular domain. It's

(28:15):
sort of very broad. But Harmony Square came about because there was an interest in inoculating people against foreign influence techniques that are being used to meddle with democracies and elections. And of course, it's such a big topic that we wanted to do a specialized version of some of the more general simulators that we've built. Congratulations, you are

(28:40):
hired, and welcome to your first day as our new chief disinformation officer. Harmony Square is a green and pleasant place. It's famous for its living statue, its majestic pond swan, and its annual pineapple pizza festival. We took that basic idea of Harmony Square, which is the last democracy on Earth. How depressing. So there, you enter into a

(29:04):
peaceful town. The content is all fictional, and it's supposed to be a bit ludicrous, right? There's this fictional town, and something very innocuous happens. Using that kind of narrative, we try to inoculate people against some of these techniques that are used to polarize people. Mm hmm. Yeah.

(29:35):
If you go see an illusionist or a magic show for the first time, you might be duped. And there's really two ways to fix that. One, I can give you a blueprint of how the trick works, which is kind of like a factual sort of treatment. Or I could let you step into the shoes of the illusionist for a little while so you can discover the trick on your own, and that way you're never going to be duped by it again. Chaos is what counts. Let's

(30:06):
create another alter ego account and pretend we're on the other side of this fight. I am totally ramped up. I am invested in this game. On the other side as well? Oh, we're definitely using bots. Deploy the bots.

(30:34):
I love that. Man, when I tell you, like, I enjoyed it. You made me into a monster, and I loved it. That's how effective the game was. I've played a lot of games, I've overseen, like, hackathons with creative activists and comedians and stuff before, so I thought I knew what I was getting into. And by the end,

(30:54):
I was like, I'm going to destroy this town, like, we won't even remember it existed. And so you have incentivized really devastating behavior. I was rewarded for it. You track the number of followers you get after each wave of these campaigns. And do you want to escalate or go home? Definitely want to escalate, right? So I look

(31:17):
back after this, and I'm like, oh man. Can you connect the dots from this game back to the real world, and how an experience like this, whether it's this game or some of your other projects, helps me interface with and process my actual social media feed better? How are my defenses more activated against real-life misinformation and disinformation? Yeah,

(31:41):
absolutely. So as you said, there's a bit of shock value to the game, precisely because one of the core elements of inoculation theory is that people need to experience a sense of threat to motivate themselves to want to defend themselves against misinformation attacks. We need to activate your antibody production, and so we need to get people, you know, into the mode. And as you said, there's an election going on during the game, and there's a
their elections going on during the game, and there's a

(32:02):
newscaster, and you can see the approval ratings live, and as you cast your chaos, they're affected. And there's this candidate, and you have a smear campaign about them. How are you going to ruin Ashley Plute's unopposed run? Message family and friends, or create a fake news site? All right, here's another option: Ashley Plute's disgusting chat messages leaked, "I bleeping hate bears." This is the kind of chaos

(32:23):
we need. We post this, and then, oh, it's like a sloth photo. What? The words, please. It's meant to be a bit amusing. What we do at the end of all of our interventions is we evaluate it empirically. So at the beginning of the game, and I'm not sure if you participated... I did. I did everything I was asked to do. We give people some simulated social

(32:45):
media headlines, and we ask them how reliable they think they are, how confident they are in their judgment, whether they would share it on social media, and things like that. And the types of headlines that we give kind of reflect what's going around on social media. So I'll give you an example. Basically, there are people protesting, say, ending Father's Day, and other people, you know, pushing back, and so on. It's an issue that's getting blown up, potentially

(33:07):
by nefarious actors, because they want to sow divisions. And so we want to know, have people become more attuned to this strategy of, for example, in this case, polarization? Another was a tweet about a news article, I think it was a father and his son who went out hunting and shot themselves or something, and somebody commented, oh, one point five MAGA hillbillies less in the world. It's this

(33:29):
type of deeply polarizing sort of debate that we wanted to address. And what we found is that people are better able to recognize these strategies, in the sense that they found these posts less reliable after playing the game, and they're less likely to say that they'll share this type of content on social media once they leave the game.

(33:51):
We started to follow up with them week after week, and don't worry, we got ethics approval for this from the university. But we sort of attacked people with misinformation week after week, and so "attacks" sounds nefarious, but we basically present people with social media posts that are misinformation, and we ask, you know, the same questions. And what we found is that, actually, for a psychological vaccine, it lasts pretty long, for about

(34:12):
two months. The antibodies are still there, you know, after two months. It helps when you boost people in between. So we found that there's a decay, like with the Pfizer vaccine, and you need a booster, otherwise it wears off. There are too many distractions, what we call interference, going on in the world that make people forget and get less motivated. But you can boost people in between by re-engaging them.

(34:33):
One of our interventions, Bad News, which is sort of the main simulator, went viral on Reddit, and it actually crashed our servers, so they call it the Reddit hug of death. The Reddit hug of death. Yeah, we started scraping what the redditors were talking about, for example, and it was really interesting to learn about, you know, people sharing their experience about

(34:53):
the game and what they've learned, and it started getting us thinking about herd immunity. Wait, maybe people are sharing the inoculation with each other on social media. And wouldn't that be cool, if people talked about what they've learned and shared it with others, so that even people who didn't directly play the game can benefit from the vaccine, so to speak? And so that's kind of what we're working on now. So, I played one of your games,

(35:17):
and I'm a good person, Sander. I am open-minded. I vote for the right people. But there's other people out there, Sander, who are not the best, and they spread lies and deceptive information all the time. Are they playing your game too, Sander? Is the other side playing this game? Are my QAnon brethren playing

(35:39):
your game? Well, it's interesting. I don't think that die-hard conspiracy theorists are playing our game, but we are thinking about some ways of trying to reach a broader audience in terms of the inoculation, getting it, you know, scaled to people who might not volunteer to come in and sort of learn

(35:59):
more about this stuff. The epidemiological metaphor of vaccination and inoculation, its strength is so clear to me, but it has some limits. We're in a real epidemiological challenge right now with COVID-19, and vaccines are a tool, but they're not the only tool, right? With any infectious disease, we don't just rely on

(36:22):
people to inoculate themselves against the threat. We have public health agencies, we have government policies, companies institute barriers. So, yeah, big question, but simplified: if not every person on Earth plays your game, what else can we do to still get a handle on the misinformation and disinformation challenge? I think

(36:46):
the uncomfortable spot that we're all in, especially as a scientist, is that it seems unrealistic. What else can we do that they're willing to accept? For us, YouTube actually doesn't work directly with outside scientists. It's actually very difficult to implement an evidence-based sort of pre-bunk on their platform. So I gotta pause you there. Earlier in

(37:09):
our conversation, you, um, acknowledged, and, you know, these platforms acknowledge: they're not in the business of accuracy or truth, right? They're in the business of engagement, the business of entertaining conversation, probably user growth. But what you just said, that YouTube doesn't work with outside researchers directly, that sounds to me like

(37:30):
they don't want to know the truth. They're actively avoiding understanding the impact of their platforms on us. What's your read on that decision? I think there's probably some truth to that, and it probably doesn't only apply to YouTube, but to most social media companies. Because the fact of the matter is, we can say what we want to them, and we say a lot to them, and we have meetings

(37:50):
with them regularly, and they listen to us, and they do respect us, and they take our evidence. But, you know, they say they have their own internal evidence that they don't always want people to extract or evaluate externally. So what can we say? It's like, okay, hold on, they have their own facts. They have alternative facts. They have their own alternative facts. This is terrible. But it's tricky, you know,

(38:11):
because they say, oh, in your experiment, you're just simulating Facebook. We can actually see what's happening on Facebook. And it's like, yeah, but if you don't want to share it, then we're not getting anywhere. We have very little information about what is actually going on on these platforms. In the studies that we do on social media, we get limited access to scrape millions of posts, but it's only a snapshot, really, of what's going on. And they're

(38:33):
very hesitant to work with scientists, and it takes a long time to build up relationships. We're getting there, but it's difficult. I just want to say, I agree with you. To us, it seems like a win when you can get them to implement an evidence-based solution, even if it's a minor one. And I will say, the people that we work with on the research teams at these companies are really motivated, and they really want to fix the problems.

(38:53):
I think the issue is with the high-level policy and executive people, who just shoot down the sort of more radical solutions that we need. I think the problem is with the people higher up, not necessarily the research teams, who are going out making connections with researchers, reading our papers, wanting to fund our research, wanting to implement solutions. And then they go to their bosses, and the bosses say, interesting, thanks so much, we'll

(39:16):
look into it. Congratulations for having any level of dialogue and partnership with these large organizations. So, we're going in the right direction. Whether we're going far enough, fast enough, we should argue about that on social media. I'd love to know how you think about the impact you've had, whether it's in

(39:36):
the partnership world with these companies, whether it's working with governments, or whether it's just, you know, individuals from Reddit or some other social share coming across some of these games. I mean, I meet with Facebook weekly. I take time, you know, out of my schedule every week to try to... Yeah. And that's incredible. How much furniture do you throw during these? Well, you know,

(39:59):
the team is really good, but the decisions that are ultimately made, it's a slow, very slow process. But we're making progress. But no, it's not going fast enough. We need more radical changes and solutions. We try to team up with the organizations that are impactful in the area that we're working in, whether it's the Department of Homeland Security, who can distribute this to all political parties and so on. Or, with our COVID-19

(40:20):
game called Go Viral, we got some support from the World Health Organization and the United Nations, and, you know, they have volunteers that can target this intervention at vulnerable audiences and really help scale it across millions of people. But there are billions of people in the world, not just millions. So I think, what can people do? Here's my general philosophy for society: I think we need a firewall system

(40:41):
to mitigate the post-truth sort of bias that's creeping in. And this firewall system, or multi-layered defense system, should start with the pre-bunking. So we should all pre-bunk: the WHO, the social media companies, even regular companies. And then, at the same time, we have to radically reinvent the incentive structure of social media. So nothing big, nothing big that I'm floating here, that's all.

(41:04):
That's all. So, we call this show How to Citizen. We think of citizen as a verb, not a noun or legal status, so much as a posture of participation in society. What does citizening mean to you? I think being a good citizen means not

(41:26):
only maintaining a healthy information diet for yourself, but also helping other people to discern fact from fiction in their lives. And I think that's how I see my role: it's not just about me, it's also about helping my fellow citizens not get duped. This is a refreshing take on a challenge that so many of us just feel

(41:48):
hopeless about. So thank you for another perspective on that. I'm excited that you've built something that's fun and terrifyingly effective at the same time. That's a hard trick to pull off. So thank you, Sander, for the time. I look forward to a more sane and healthy information environment for us all to inhabit. Thanks so much for having me on.

(42:25):
We've all been there. I know I have. Just kicking it on the internet, and some jerk shows up spreading infuriatingly incorrect garbage. So we do what any good citizen is supposed to do. We dump data, we fire off facts. We counter that misinformation with real information to prove that

(42:46):
jerk wrong. But Sander wants us to reimagine and reframe the way we approach misinformation. We can't hit people over the head with facts. Whack-a-mole is ineffective. Games like Harmony Square, on the other hand, teach us some of the dark arts of misdirection and illusion.

(43:09):
It's like peeking behind the curtain and seeing the great and powerful Oz for the first time. Once you see him and his dirty bag of tricks, he loses some of his power. So stay safe, stay alert. Think twice before you hit that share button. But honestly, think also

(43:30):
about who even wants you to hit the button in the first place, and what they have to gain from it, as we check in with ourselves about the content we consume. Next time, we get a lesson on tech nutrition, because machines gotta eat too. Bias in, bias out, like garbage in, garbage out. You feed this machine something, the machine is

(43:51):
going to look exactly like what you fed it. You are what you eat. By now you know we're committed to giving you things to do beyond listening to our episodes. And on our how to citizen dot com website, we've got every episode, transcripts, links to the guests. But most importantly, we have things you can do to actually practice citizening.

(44:12):
So in that spirit, for this episode, here are some things you can do. Point your browser over to inoculation dot science. That's right, there's a dot science domain name. Get your science on. Head on over to inoculation dot science, one N, and play the set of games that they've built. In addition to Breaking Harmony Square, which you heard me

(44:34):
playing, and acting the damn fool as I did so, they've got games to help you limit the harm of fake news and COVID misinformation. After you've played some of the games and watched some of the videos, reflect on how they made you feel. Are there online experiences you've had that make more sense once you consider you might have been intentionally manipulated? How does that feel? I suspect

(44:56):
it makes you mad, but it might also make you feel more empowered. And do you think these games might affect how you engage online in the future? Finally, share these games with the people you care about. So many of us have folks in our lives, and we don't want to waste hours and hours convincing them that something is obviously false when we take real information into account. Look,

(45:18):
friends don't let friends spread misinformation. It's kind of as simple as that. I don't expect you to memorize all this. Everything I've said, a version of it is in the show notes in the podcast app you're listening on right now, and we've got all these links over at how to citizen dot com. You can also engage with us on IG, on Zuckerberg's property. We are at how to

(45:40):
citizen, where you can share and learn from other people who are on this journey with us, including me. That's all I got for now. Peace. How to Citizen with Baratunde is a production of iHeartRadio Podcasts and Dustlight Productions. Our executive producers are me, Baratunde

(46:02):
Thurston, Elizabeth Stewart, and Misha Euceph. Our senior producer is Tamika Adams, our producer is Ali Kilts, and our assistant producer is Sam Paulson. Stephanie Cohn is our editor, Valentino Rivera is our senior engineer, and Matthew Lai is our apprentice. Special thanks to Sam Paulson for creating the chiptune arrangement of the How to Citizen theme and the Harmony Square-inspired tunes to accompany my gameplay.

(46:25):
This episode was produced and sound designed by Tamika Adams, with additional help from Sam Paulson. Additional production help from Arwen Nicks. Special thanks to Joel Smith from iHeartRadio and Rachel Garcia at Dustlight Productions.