Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Jean Gomes (00:03):
What's the impact of misinformation on your future leadership agenda? Perhaps it's more than just the responsibility of your finance and IT departments. One in five people got scammed last year in the UK, 9 million people. The annual worldwide cost of online fraud to consumers is put at somewhere around ten billion, but it's hard to know exactly,
(00:26):
because many of us don't report because we're ashamed or time poor. But we're not just being fooled out of our money. We're also being robbed of our ability to think for ourselves. Upwards of 90% of people have fallen for fake news in the past year. The rising flood of mis- and disinformation affects our political beliefs, our relationships with our friends,
(00:49):
neighbours and communities, our mental health and our sense of control. It's driving polarisation that isn't just confined to online spaces, but is creating rising tensions and politics in our workplace. As more of what used to be personal becomes political, and then a matter of what affects us at work, leaders must continue to track the shifting agenda of
(01:11):
these new responsibilities. In this show, we talk to one of the world's leading researchers on misinformation, how it spreads in online networks, and what behavioural interventions we can design to counter it at scale. Tune in for a crucial conversation on The Evolving Leader.
Scott Allender (01:49):
Hey, folks,
welcome to The Evolving Leader,
a show born from the belief that we need deeper, more accountable, more expansive, more comprehensive. What else, Jean? What else do we need?
Jean Gomes (02:00):
More eloquent?
Scott Allender (02:01):
More eloquent,
most definitely more eloquent,
and more human leadership to confront the world's biggest
challenges. I'm Scott Allender
Jean Gomes (02:09):
and I'm Jean Gomes.
Scott Allender (02:10):
How are you
feeling today, Mr. Gomes?
Jean Gomes (02:12):
I am feeling two
things. I'm feeling glad that
I'm at the back end of flu. I'm coming out of that. I'm feeling more like myself, and I'm also feeling a real urge to get some thinking from our guest today, because I think we all need this, given what's been happening in recent months and
(02:33):
years. So I'm very excited to kind of be in the space to think
differently about how I think.
How are you feeling?
Scott Allender (02:43):
I'm feeling a
mix of things. I'm feeling
grateful for my toasty house right now as it snows outside and my kids are home for their second snow day after only being back from holidays for a week. But I'm also feeling a lot of sadness and sort of deep concern for many family members and friends and neighbours from my old stomping
(03:03):
grounds in the Los Angeles area, and watching all of that unfold has been horrific. So I'm bringing a mix of emotions with me today, and quite a bit of excitement as well, because we've been looking forward to this conversation for quite some time. Because today, we're joined by Sander van der Linden. Sander is a Professor of Social Psychology in Society in the
(03:26):
Department of Psychology at the University of Cambridge, and Director of the Cambridge Social Decision-Making Lab. Before coming to Cambridge, he held posts at Princeton and Yale University. His research interests centre around the psychology of human judgement and decision making. In particular, he is interested in the social influence and persuasion process, and how people are influenced by
(03:49):
misinformation and gain resistance to persuasion through psychological inoculation. He is also interested in the psychology of conspiracy theories, extremism and radicalisation (not sure where he's seen that in the world, but we'll be interested to talk to him about that), media effects, social networks, polarisation and the emergence of social
(04:09):
norms, reasoning about evidence, and public understanding of risk and uncertainty. He has published around 175 papers and is ranked among the top 1% of all social scientists worldwide, and we loved his multiple award-winning book, Foolproof: Why Misinformation Infects Our Minds and How to
(04:29):
Build Immunity, which we're gonna dive deep into today. Sander, welcome to The Evolving Leader.
Sander van der Linden (04:36):
Thanks so
much for having me on.
Jean Gomes (04:37):
Sander, welcome to
the show. How are you feeling
today?
Sander van der Linden (04:41):
I'm good,
I'm good. You know, also a mix of feelings about everything that's going on in the world. You know, from the wildfires, I have two colleagues whose houses burned down as well in LA, to being depressed about Meta shutting down its fact-checking programme, right, to, you know,
(05:03):
being warm inside while it's cold outside, and, you know, being back from holidays and ready to do some work again.
Scott Allender (05:14):
Well, we're so
excited to talk to you. And I
love the idea, when I heard this, that you're called the Defence Against the Dark Arts teacher at Cambridge. And before we get into the depth of your work, I'm interested in knowing: when was the last time you were fooled by misinformation?
Sander van der Linden (05:33):
Well, you
know, it happens more than people would think. Just yesterday, I had a meeting with my students, and I'm organising a conference, the Cambridge Disinformation Summit. This is not an advertisement for it, but one of my students said, oh yeah, the dates are
(05:55):
April 12th to 16th, when somebody asked us, and I was like, oh yeah, that's right. And then I got home and I looked, and those were not the dates of the conference; you know, it's the last week of April. But for some reason, there was this kind of blind trust in an otherwise, you know, very smart student. So I used a cue of: this person must be
(06:18):
smart, hence what they're saying makes sense. Which most of the time is a good heuristic, but it can fool us some of the time, especially when people speak about things that may not be in their expertise. And, of course, organising my own conference is within my expertise, but I had forgotten the actual date. So I can't blame my student, I can only blame myself. But there you go, I was fooled by misinformation.
Scott Allender (06:41):
Misinformation going around about the disinformation conference.
Sander van der Linden (06:46):
About the disinformation conference, yeah. And, you know, I think
this was misinformation because, I mean, it wasn't intentional. She wasn't trying to deceive anyone, right? It was just an error. So, you know, fairly innocent, innocuous misinformation, but just an illustration of how our brains are quick to jump to conclusions about things that kind of half match what we think is true about the world. And, you know,
(07:08):
it happens on a frequent basis. And there are some people who think that we're all so smart and immune to everything, but from what I know, based on my research and my daily experience, I'm fooled all the time.
Jean Gomes (07:24):
Your passion to
understand and debunk fake news is tied to your sense of identity, I think. Can we start with how your childhood experiences in the 90s in the Netherlands shaped where you are today?
Sander van der Linden (07:38):
Yeah. So, you know, I grew up fairly... you know, we weren't overtly religious, but, you know, my parents were
Jewish, and there's a sort of tradition of talking about what happened in the Holocaust. And you know, when you're little, it's hard to wrap your head around this. You know, your family tells you... like, you ask, okay,
(07:59):
where are we? You know, where are my cousins? Where are my nephews? You know, why don't we have more family members? And it's kind of strange when your parents tell you, well, you know, most of them were executed during World War Two. And so, you know, initially you just kind of accept that as an explanation. But, you know, when I got older, I started thinking about, you know, why do these things happen? You know,
(08:21):
how do people act on information that isn't consistent with facts or is based on conspiracy theories? How does that motivate people to do bad things? And so that didn't directly lead me to then investigate misinformation, but it actually sparked my interest in discovering the field of psychology: why do people do what they do? How does the brain work? How does information
(08:42):
impact us? And so that, you know, I think was something that did get me interested in trying to understand human psychology. And then eventually I got back to the study of propaganda and persuasion in different contexts. But certainly, yeah, certainly there was a personal element to it, in that, you know, I was interested in how
(09:05):
atrocities happen, and what role propaganda plays.
Scott Allender (09:13):
Tell us more about that journey of discovery. I'm curious to know, kind of, you know, as you started to dive into the psychology of it, what was most surprising for you?
Sander van der Linden (09:25):
actually,
what was surprising to me is
that it's, it's actually veryhard to study. So there's a lot
of sort of historicaldocumentation about Nazi
propaganda. How you know whatwent into it, how they designed
it, things like the big lie,which has a lot of relevance
(09:48):
today, where it's really about the idea that if you create a lie that's so outrageous, it's actually more likely to work, because people can't imagine that you would just come up with such an outrageous lie, and the more you repeat it, the more
(10:09):
common it becomes for people. So you start out with an outrageous lie, which seems kind of crazy, but then over time it becomes normalised. And so what surprised me is that some of these propaganda tactics actually seem to be repeated throughout history and in different contexts. And I
(10:31):
started thinking about the factthat, you know, it wasn't just,
you know, World War Two. I mean,I certainly did some of the
groundwork in terms of figuringout what's effective. But what
surprised me is actually thescale at which this is
replicated across differenttypes of Conflicts and
situations. And I startedthinking, well, there's actually
(10:52):
this playbook to propaganda that we should be uncovering, and the research around it is actually very difficult, both for ethical reasons (I mean, you know, nobody's really done experimental research on Nazi propaganda; there's just no ethics committee at a university that would approve an experiment where you expose people to Nazi propaganda and see if they go out and attack Jews or members
(11:15):
of LGBT communities, right? That just doesn't happen) and in terms of, you know, trying to uncover the mechanisms by which it happened; it's all a bit indirect. So I became fascinated. I was kind of surprised by the fact that it's not something that you can easily study in a controlled and, you know, experimental setting. It takes a lot of interdisciplinary work, you know, historians and psychologists
(11:39):
and simulation studies, and economists actually doing very interesting work. There was a great study, which I also cover in the book, which looked at birth cohorts. And it looked at, you know, people who were born before the Nazi regime, during and after. And of course, the
(12:02):
Nazi years included years of indoctrination in schools; there were pamphlets everywhere. You know, it was a very extensive programme. And they did find, and this is maybe to come back to Scott's point about what's so surprising, that, you know, of course, the expectation was that people who grew up during the Nazi regime would have more anti-semitic views than people who didn't. But what was so
(12:24):
surprising is that, you know, basically this was done, you know, 2006, 2010, so not too long ago; we're talking, you know, 10, 15 years ago. So even people who are currently living in Germany still have more anti-
(12:45):
semitic attitudes if they were, you know, part of a birth cohort that grew up during the Nazi regime than people who didn't. And so they have these surveys, the German General Social Survey, and they ask, you know, are Jews behind world events? And, you know, should Jews be allowed to, you know, walk freely in German society, and, you know, things like that. And
(13:06):
you know, regions where people voted for Nazi parties and who were more exposed to propaganda were still more anti-semitic to this present day. And it was kind of surprising to me, because I kind of thought it was all over, you know, like this happened during World War Two and you would find an effect during that period, but not now, or, you know, 10 years ago. But that's what they found. And I
(13:29):
think that really speaks to the longevity and the persistence of being exposed to propaganda. And I worry, you know, when we live in a time where we're just bombarded with lies and propaganda. And yeah, you know, sometimes people say, oh, you know, it's not so bad, or, you know, it's not going to affect us. But I think what we know from
(13:49):
propaganda is that, sure, people don't wake up one day and decide to start putting people in the gas chambers. That's not how it works. It's years of exposure to, first, subtle and more implicit propaganda. And then, you know, you soften people up, you get them ready, and then it becomes more active, and then you start bombarding people with the more dangerous ideas. And
(14:10):
throughout that whole process, it can actually leave an imprint on people many, many decades later. And yeah, that was surprising to me in terms of the outcome of that study. And, of course, crucially, I should say that, you know, we're not empty vessels. You know, we're not exposed to propaganda and all of a sudden, you know, we get green
(14:32):
lights, you know, a little green light in your eyes, and you become a manipulated robot. I mean, that's not how it works. But, you know, it was areas that were already susceptible to this kind of rhetoric before the Nazi regime where the propaganda was most successful, right? So you have to find people who are susceptible to your message in
(14:53):
order for propaganda to work. But it was surprising that it also worked in areas where people didn't have that prior susceptibility; it just worked less well. Obviously there was more resistance. But yeah, it's kind of worrying to think about it, that, you know, you don't even need that initial seed necessarily, but it certainly
(15:14):
helps. And that was also the conclusion from Goebbels, who, you know, was the minister of propaganda during the Nazi regime, who said that, you know, it works best amongst people who were already congenial to the message. And yeah.
Jean Gomes (15:34):
So, I mean, we might
live in a world where we think, given our underlying assumptions, that we've kind of evolved past the likelihood of being fooled by these kinds of messages. What do you think is different today about how our brains have adapted to the online space?
Sander van der Linden (15:53):
Yeah, I think... what's different today? And
yeah, sure, some people saylike, oh, you know, but we've
always had propaganda, right? Wecould go back further than World
War Two, we go back to mediaevaltimes where, you know, people
were stoning women and callingthem witches because they had,
you know, ideas of independenceand autonomy and and, and, you
(16:13):
know, they were masteringcertain skills and crafts. And,
you know, we label them witchesand burn them at the stake. So
that's, you know, that's,that's, that's been happening
for, for a long time. And sopeople ask, okay, what's, what's
different now? But I do thinktechnology has fundamentally
shaped our relationship withinformation. I think that's, you
(16:33):
know, that's, that's what'sdifferent, but also the speed
and velocity at which ittravels. I think in the book, I
actually calculated how long itwould take in the Roman Empire
to spread a message by, youknow, horse and waggon. And it
would take, you know, a week,even if you had the fastest
cart, you know, it might take aweek to get your message out
there. And then it would spreadin a village. And so I kind of
(16:55):
go through that diffusionprocess of, you know, back in
the olden days. And of course.
Now it all happens within a split second. You know, you send a message on WhatsApp to a group of, you know, 100 people who forward it to another group of 100 people. And before you know it, you know, it's exponential, right? So before you know it, millions of people have been exposed within, you know, a couple of seconds. And same on social media. I mean, it's
(17:17):
the networks are different, andso the structures are different,
and how it, how it, howinformation diffuses, but we can
reach, you know, hundreds ofmillions of people, you know,
within within seconds, andpeople just, oh, you know, we
didn't necessarily evolve beingbombarded by information all the
time. That's also why we rely onrules of thumb, right? The brain
(17:40):
is always trying to look forways for shortcuts to try to
manage information load. And youknow, it's a difficult task for
people to now log on and beexposed to so much information.
And it's not just social media,right? It's cable news, it's
podcast, it's radio, it's print,it's friends and family, it's
social media sites, and more andmore social media sites are
(18:02):
popping up, and so we're tryingto juggle all this information,
and we know that if you stressthe brain out, even on a simple
task, if I give you a littlememory task, you'll get less
accurate at doing other tasks,because cognitively, our
resources are divided. We onlyhave so much, and we have to be
biassed in some ways, and biasin terms of screening out other
things. And so I think theinformation overload makes that
(18:25):
process worse, in that we become more biassed, we start being more selective, screening out more. And by selective, I mean, you know, it's easier to process information that you've heard before, that feels familiar, that sounds familiar, that aligns with what you want to be true about the world. So all of those undesirable kinds of biases that do have a function but creep in and lead people
(18:48):
astray. There's a relationshipbetween the online and the
offline world. Sometimes peoplethink, well, it's okay, it's all
just virtual conversations. Butthat's not true. You know, as
we've seen in the UK, you know,what starts as a fake news story
about, let's say, you know, a tragedy. For people who
(19:09):
don't know, there was an assailant who stabbed a number of young girls, right, at what I think was a dance class, and there was fake news about who this person was. There was some fake story that it was a Muslim asylum seeker who came to the UK by boat. I mean, this
(19:29):
was all total nonsense, but it was the catalyst for, you know, major offline riots that occurred. Buildings were, you know, set on fire, people got violent. And you see that elsewhere. Take January 6th in the US, right, the Capitol riots: you know, false conspiracy theories that start online about stolen elections actually then can lead
(19:53):
to violence. And it doesn't always, you know... in this election, there were left-wing conspiracy theories about the election being fraudulent, and that didn't lead to violence in that case, but sometimes it does. There's a
probability there. There's aconcept called stochastic
(20:15):
terrorism, or stochasticviolence. And so you know, all
of that online hostility,toxicity raises the probability
of of stochastic violence. Andso, you know, that's, that's
new, you know, it's notsomething we've dealt with
before. And then the last thingI'll say is, you know, with the
role of AI and deep fakes, microtargeting, right? What's,
(20:37):
you know, Hitler didn't have targeted ads, right? Imagine if he had access to, you know, voters' digital footprints, and he could target them specifically based on their prior beliefs, because that's what they were looking for, right, people who were congenial to their message. What if they had such a fine-grained machinery that actually allows
(20:57):
them to target a single personwith their specific preferences.
And so that's the environmentthat we find ourselves in now
that there's a technology thatcan do that. And, you know,
a couple of years ago, right, you know, there were Russian troll farms who were writing these messages by
hand or on the computer, right?
Oh, oh, this person, you know,is watching these YouTube
(21:19):
videos. So let's craft a messagethat plays into, you know, the
type of content that they'rewatching. That's labour
intensive. It's costly. Now you can have AI write hundreds of thousands of variations on the message within a split second,
right? You can just automate it.
Deep fakes. People are not usedto that. People are very bad at
it. You know, research shows,you know, people struggle with
differentiating synthetic fromfrom authentic content online.
(21:40):
You know, luckily, we're notinundated by by hyper realistic,
deep fakes yet, but thetechnology is there, and we know
that people are bad at discerning them, and regulation is trying to catch up. So, yeah, I would say there's a lot going on in
the online informationenvironment that makes our
situation very different fromwhat it was before. And the last
(22:02):
thing I'll say is, is thatthere's a, there's been a big
change, I think, in the newsmodel too. So you know, when you
go from the printing press andthe first, you know, yellow
journalism, the first tabloids,right, which were already a bit
grey area content, sometimeswith some hoax stories here and
there that were duping people, to cable news, which also
(22:25):
lowered the standard a bit ofwhat you can say. But at the end
of the day, news, traditionalnews, Legacy news, has
producers, editors, factcheckers, there's guard rails,
there's lots of layers. You can't just go, and there are regulators. You know, in the UK, you have Ofcom, for example; in the US, you have, you know, independent watchdogs, there are procedures. You know, false advertising, there's laws around
(22:48):
that. So there's a there's along history that, you know, the
evolution of TV took a long time, right? Before there was full penetration of television in the general population, there was a lot of time for legislation to evolve, to try things out, test things out, see what's going on. With social media, it
all happened within the courseof just a few years, right? It
(23:09):
exploded, and I think we'restill trying to catch up. You
know, it's a whole different debate with Jonathan Haidt's book
and people going back and forthover whether social media is bad
for our mental health or good,is it dangerous for teenagers
and teens and and so on. Andthose are, you know, difficult
debates, and the evidence isdifficult to comprehend because
there's so much exposure and solittle research in terms of what
(23:34):
we actually know, and trying to catch up to that is difficult. And I think that's partly what's going on with this debate as well, you know. So what I would say is that on YouTube, you know, there's no guard rails, there's no editor, no fact checker, there's no producer.
(23:54):
Anyone can just say what they want. There's no barrier to entry. So that influencer model of news... a lot of people are getting their news from influencers now online, more and more people getting their news
from social media. I mean,there's no verification process,
and so that's that's a wholedifferent model from the model
that we're used to with news,where people assume, expect,
(24:15):
that things have been factchecked. Of course, errors are
made, but there's a process,there are committees, there's
procedures, right? And we don't have that with the current online environment. There was a survey from UNESCO of the top influencers on Instagram, TikTok, and they asked them if they checked their content before they put it out. And they said no. I think 90% or something said, no, we don't really view our role
(24:38):
that way, you know; it's about being relevant, it doesn't have to be factual, and we're just trying to influence people with interesting content, right? And they don't see themselves as purveyors of accurate news, and that's a whole different model. So
Scott Allender (24:57):
you've said so
many important things in there,
I want to pull some of it apart a little bit further, if we can, because you talked about susceptibility, and we're all susceptible, and there's surveillance capitalism and targeted marketing according to what you engage with. And that's a real problem; we've talked about that on the show. You also hit on the group of people who are most likely to gravitate
(25:18):
towards information that aligns with what they want to be true about the world. And I see that a lot in the sort of coaching work that I do as well, that there are some people who are really more prone to that, and some people who are less prone to it; they have a more inherent sense of doubting that's healthy, of like, I'm going to challenge that idea that I've seen, or I'm going to explore it, I'm going to look
(25:39):
for other avenues. So, wherever you fall on that spectrum, you talk about this idea of psychological vaccination against misinformation, that you can inoculate yourself. So can you start pulling that idea apart a bit further for us?
Sander van der Linden (25:53):
Yeah. And
so that is based on this idea
that, you know, there isvariation in susceptibility.
We're all susceptible, but somepeople are more susceptible than
others. Some some factors haveto do with with the brain or
people's personalcharacteristics, but sometimes
it has to do with theenvironment in which you find
yourself, and that kind ofdetermines susceptibility,
(26:15):
right? Then there is howmisinformation spreads on social
media, and it turns out you canactually use models from
epidemiology that we use tostudy viruses, to study how
information spreads, and theyactually fit pretty well. I
mean, they're simple models. Youknow, people spreading
information is slightly morecomplex than than than a virus
replicating itself, but, butthere's a really close analogy
(26:35):
there that those models actuallydo really well at predicting
how, you know, viral rumours diffuse. And so we started
wondering, well, ifmisinformation spreads like a
virus, is it possible to theninoculate or vaccinate people
against it? And this sort ofresearch goes back to the 1960s
where people have been askingthe question of whether it's
(26:57):
possible to protect people. And this was actually in a particular context: this psychologist, McGuire, was really interested in this idea because
the US government, this was atthe time of the Korean War, and,
you know, there were prisonersof war in Chinese prison camps
(27:17):
at the time that voluntarilystayed behind after the war. And
the US government was concernedthat they had been brainwashed,
and, you know, that they weregoing to explore communism. And
of course, reality turned out tobe much more complex than that,
but, but they didn't know, youknow, at the time it was going
on. And so their their wholeimpetus was like, Okay, for the
next generation of soldiers,we're going to give them more
(27:38):
facts, and we're going to teachthem about all the things you
know, why America is so greatand and if they ever get, you
know, captured again, they won'tbe confused about, you know, why
America is is right. And McGuirekind of said, I don't think
that's the right approach. Ithink what's happening is that
when they were captured and people were attacking their
(27:59):
belief system, they had nomental defences. And that is the
problem, that that what you needto do is actually simulate an
attack on soldiers, so to speak.
You know, give them a sense ofthe types of threats that they
might be facing in the future,then refute those in advance, so
that you give them theammunition to build up
(28:19):
resistance over time. And that,I think, was a very powerful
idea. I mean, he never tested itwith soldiers or anything like
that. In fact, when he tried todo a survey in his class, there
was actually mixed support forcapitalism and communism. So it
wasn't about value judgments,about about these things, but he
wanted to know the process. Andso he did some, some studies
(28:40):
that were interesting. But then,yeah, then, then he left that
research for what it was. And soone day, I became fascinated
about this idea of, of whetheranyone's already looked into
that. And I came across, in thelibrary, came across an article
from the 60s, where, where, youknow, he, of course, McGuire, I
didn't have the internet at thetime. He didn't know about
(29:01):
epidemiological models. Butactually, let's take that basic
idea, that metaphor, and, youknow, let's, let's test that
empirically in, you know, in the21st century. And that's kind of
how, how it evolved, and how webuild out this theory of
psychological inoculation or prebunking, instead of debunking so
how does it work? You know, itjust follows the vaccination
(29:23):
analogy. So the idea is thatinstead of just giving people
facts, you actually give peoplea weakened dose of the types of
misinformation they might see inthe future, or the techniques
used to produce misinformation,and you deconstruct, neutralise
to pre bunk them in advance, sothat people become more immune
to it in the future. So just asvaccines introduce a weakened
(29:43):
pathogen into the body thattriggers the production of of of
antibodies to help conferresistance against future
infection, it turns out you cando the same with information and
the brain. And you know, thebody's looking for right
antibodies are made, and it'slooking for potential. Invaders,
and the brain does the samething, like when it comes to
threatening information, thebrain's looking for, you know,
(30:04):
what's threatening me? And so itbenefits from seeing lots of
micro dose examples of what'smanipulation, what's not
manipulation, so it can helpdiscriminate better between the
two. Can
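To make the epidemiological analogy Sander describes a bit more concrete, here is a minimal, illustrative Python sketch (not the lab's actual model; the parameters and population figures are made up for illustration) of a simple SIR-style simulation, where "inoculating" a share of people in advance shrinks how far a rumour spreads.

# Illustrative SIR-style sketch of rumour spread (assumed parameters, not from the episode).
# S = not yet exposed, I = actively sharing, R = no longer sharing.
# "Inoculation" is modelled crudely as moving a fraction of people into R before the rumour starts.

def simulate(population=10_000, beta=0.3, gamma=0.1,
             inoculated_fraction=0.0, steps=200):
    s = population * (1 - inoculated_fraction) - 1   # susceptible
    i = 1.0                                          # one initial sharer
    r = population * inoculated_fraction             # pre-bunked / no longer sharing
    peak_sharers = i
    for _ in range(steps):
        new_infections = beta * s * i / population   # new people who start sharing
        new_recoveries = gamma * i                   # people who stop sharing
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak_sharers = max(peak_sharers, i)
    return peak_sharers

print(round(simulate(inoculated_fraction=0.0)))   # no pre-bunking: large peak of sharers
print(round(simulate(inoculated_fraction=0.5)))   # half the crowd pre-bunked: far smaller peak

The point of the sketch is only the qualitative behaviour Sander is gesturing at: raising the pre-bunked fraction lowers the peak number of people actively passing the rumour on.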
Jean Gomes (30:16):
you give us an
example of something that you
know listening right now youcould give us, like, an
inoculation against some Yeah,yeah, information
Sander van der Linden (30:25):
absolutely,
yeah, no, it's a good point. So
because, you know, does sound abit abstract, I'll make it very
concrete. Now I'll choose a nonpolitical example. Because, you
know, sometimes, you know,people might, you know, say, oh,
but you know who determines whatinformation is correct and not
correct. So let's sidestep thatfor a second, and let me give
you an example that that's moreabout the techniques that are
(30:46):
used in in disinformation. Andone of those is the the false
dilemma technique, which is oneof my favourites, not to use,
but to study right? Which is theidea that you present people
with with two options, while, infact, there are many more. But
the whole goal is that you tryto take out all the nuance, and
you you try to breed extremism.
And so, for example, somebodymight say, now, in the context
(31:09):
of the US, you might say that,you know, oh, if you're, you
know, if you don't supportautomatic rifles, you're against
the Second Amendment orsomething like that, right? So
you can be, you can be asupporter of the right to bear
arms, but, but maybe automaticrifles in schools is a little
too much, right? So there,there, there's a, there's a lot
(31:31):
of nuance there, but you portrayit as a false dilemma. Or, you
know, if you support thiscountry, then you're against
this country in a particulartype of conflict. Actually,
maybe you sympathise with bothcountries, right, and the goals
that they're trying to achieve.
So politicians love to use falsedilemmas, and disinformation
producers do that too. So theinoculation. So what's an
(31:53):
inoculation in this context, youhave to find a weakened dose
that is harmless but generatesthe relative resistance that you
need. So we produce these videoswhere we actually start to start
using popular culture examples.
So we expose people to a clipfrom Star Wars. Are you guys
Star Wars fans? I'm not sure. Alittle bit, yeah, little little
(32:14):
bit. There's a there's anepisode called Revenge of the
Sith and so Obi Wan Kenobi istalking to Anakin Skywalker,
who, spoiler alert, becomesDarth Vader, right? And so, oh
man, you gave it away. I gave itaway. And so, you know, he sort
of says, you know, either you'rewith me or you're my enemy. And
then Obi-Wan goes, you know, only a Sith deals in absolutes.
(32:38):
And so then the narrator goeslike, oh, you know, don't use
these manipulation techniques.
Don't join the dark side. Andit's kind of a fun, non
political way of illustratingthe the false dilemma, right?
Everyone gets, gets thatexample. But then we test
people, we bombard them withsocial media messages in in
(32:59):
real, high stakes context thatthat use this technique, and we
find that people become betterat at recognising it, because
they've internalised theweakened dose, and now have some
some immunity, because it has atemplate. And it's the Star Wars
thing is funny, because it givesyou the template of either your
for or against. And that's sortof the, you know, the element
(33:22):
people remember, and then whenthey're bombarded by, it doesn't
matter what the content is, butit's the same structure, either
you're for or against, and itsort of rings a bell and says,
Oh, yeah, that's a false choice.
I shouldn't fall for that. Or atleast now I'm empowered to make
up my own mind. And that's,that's the idea behind pre
bunking. You
Jean Gomes (33:42):
Why is
disinformation so hard to get
rid of? I mean, even when youknow we know it's untrue, why
doesn't the virus leave us?
Sander van der Linden (33:52):
Yeah, so,
why doesn't the virus leave us?
Yeah, yeah. Well, in that sense,it is tricky, because once we're
exposed to a falsehood. Itintegrates into our memory
system. So our memory is kind oflike a network, like a social
network, so it has nodes, or,you know, concepts like food,
(34:12):
vaccines, right? And thenthere's links between the
different nodes, so there's livevaccines, you know, inactivated
vaccines, cities, foodcombinations of the two, right?
It's this vast network of ofconcepts and all the sorts of
links between them. And when youexpose people to misinformation,
for example, you know, let's say, Jean, that, you know, I had takeout, you know, from the Indian place around the corner from where you live, and I had terrible food poisoning.
corner from where you live, andit had terrible food poisoning.
It's terrible. And I go on andon about like I was, you know,
it's terrible. Don't go there.
Then two weeks later, we werecatching up, and it's, oh, by
the way, you know, it wasn'tthat place around your corner.
It was a totally differentplace. It got confused. But now
every time you pass that place,you're going to think food
(34:57):
poisoning. And so. The problemis that what you see in
research, and this is what wecall the continued influence of
misinformation, is that peoplecontinue to retrieve false
details from their memory, evenwhen they've acknowledged a
correction in the in the formalsort of experiments that we
that's done around this, you putpeople in, let's say, a brain
scanner machine, and then yougive them some story about how a
(35:19):
warehouse burned down. And, youknow, there was some report that
said there was oil and gas cansin the in the closet that caused
it. And then later, the policechief actually says, you know,
oh, the the oil and gas thatwasn't the cause of the fire or
something else. But then you askthe, you know, people come out,
and you ask them to make, youknow, judgments, inferences
(35:40):
about lots of questions, Why wasthere so much smoke? And then
people say, Oh, it's because ofthe oil and the gas cans. And so
they completely, you know,forget that they've just
acknowledged the correction thatit wasn't the oil and the gas
cans. And so it sneaks in, itmakes friends with other things
that, you know, and it becomes agame of whack a mole. So, so you
can try to inactivate some ofthe nodes and links, and you can
(36:01):
see that in the scanner, in thesense that there's some, you
know, there's what we call aretrieval account, which is that
something goes wrong during theretrieval process of the
correction. So people areaccessing the myth in their
brain, but they're notretrieving the correction. For
(36:22):
some reason, there's also anintegration account, which is
that people are not integratingthe correction into their mental
model of how the world works. Sothey have the myth, but they're
not integrating the correctionaround it. And so it's a bit
neuroscience-y, but I think that the bottom line is the
same, though. It's that, you know, corrections tend to be long and scientific. They don't resonate
(36:43):
with people. They come too late.
So there's all sorts of reasonswhy people just forget about
corrections, and, you know, andsort of focus on the myth. And
in fact, when you have todebunk, you're always in a
strategically disadvantaged,disadvantageous position,
because you have to repeat themisinformation in order to
(37:03):
debunk it. So right? So ifyou're not debunking
effectively, what you're doingis you're repeating the
misinformation. People forgetthe correction. So fact checking
does work partially, if you doit well and you don't repeat the
misinformation too often. Somepeople like to call, we call the
truth sandwich, which isbasically, you start with facts,
you only repeat the necessarymisinfo once, and then you layer
(37:24):
on the facts again at the end tominimise any risk of repeating
the misinfo. But even in thebest case scenario, you're only
partially undoing the damage. There's a famous saying
from the legal context as well,when a jury hears something
about a defendant they weren'tsupposed to during a trial, and
the judge says, No, disregardthat you can't unring a bell.
(37:44):
And that's that's kind of the
Jean Gomes (37:45):
point. Hence, the importance of pre-bunking.
Sander van der Linden (37:50):
Exactly, trying to
prevent people from encoding
misinformation in the firstplace is so much, you know, in
theory, so much more effectiveand desirable. And so that's why
we try to spend so much time ondoing that prevention is better
than cure.
Scott Allender (38:06):
Is the scale of
the current sort of propaganda
machine and false choice andmisinformation? Is it as big as
it feels to me at the moment,sitting here in the US and you
mentioned at the beginning ofthe show the sadness you feel
around meta, doing away with itsfact checking. I'd love to sort
of get your thoughts on on thescale of the problem and kind of
(38:28):
what's going on in yourperspective with, you know, like
meta doing what they're doing,for example.
Sander van der Linden (38:33):
Yeah, I
can answer that question. I
think you know, because youmentioned meta, I think one, one
thing that disappoints me is thepoliticisation around this
topic, right? And it's becomethe idea that fighting
misinformation, ormisinformation itself becomes
fused with, with with, you know,party or identity in some way.
(38:56):
That's a sad state of affairs. Imean, everyone should care about
facts and truth, you know,including businesses and
leaders, and we all have othermotives in life. You know,
people have social motives. Theyhave reasons why they prefer not
to tell the truth or endorsesomething that's not entirely
true, and that's true for peopleas it is for businesses. There
(39:16):
are other motives, but at theend of the day, we should all
strive, or do our best, to rally around the truth. And Mark's statement that fact checking is biassed or didn't work for him, it's a bit disappointing, because that's not what the
research shows. I mean, therewas just a paper out this week
that looked at in the US membersof Congress, they get they get
(39:40):
fact checked at equal rates.
Democrats and Republicans getfact checked at about equal
rates. So it's not thatRepublicans are being singled
out for fact checking on theFacebook platform. You know, he
wants to move to communitynotes, which is a sort of
crowdsourced fact checking, which I think is not a bad idea.
I mean, this is a this is a niceidea. There is wisdom in the
(40:03):
crowd. So it's a kind of astatistical artefact, that if
you have large crowds of peoplethat are diverse in nature, then
you're pooling all of thatknowledge, right? All that
wisdom, some peopleoverestimate, some people
underestimate. But on average,you're actually getting pretty
close. But what's interesting isthat when you look at the
research, the ratings fromregular bipartisan crowds almost
(40:25):
perfectly converge with those ofexpert fact checkers, and so
it's not the case that there'ssome elite biassed fact
checkers, you know, tellingpeople what's going on. They're
giving the same rating asregular bipartisan crowds when
they're in the mindset of what'saccurate and what's not. And
then you can say, Oh, we canreplace one with the other, but,
(40:47):
but that's not necessarily true,because by itself, community
notes is not enough. By itself,fact checking is not enough. We
need all of it, not, you know,not less of it, which is
basically what what he's doing.
And that comes to your questionabout volume, because yes, there
is a the volume has increaseddramatically. It's difficult to
(41:09):
put specific numbers on it,because you would need to have
access to the archives of socialmedia since their inception. A
lot of that data is private, sowe don't know for sure. We can
only see public snapshots, but Ithink researchers are in fair
agreement that there's moremisinformation. Now, if we adopt
(41:32):
a broad definition of not thingsthat are that are 100% false,
but if you also include thingsthat are misleading, there's a
lot of it. The volume is bigger.
It's reaching people in moreways, right? It's, it's your
phone, Snapchat, you know,Tiktok TV. There's some more,
many more channels in throughwhich it can reach people. What
people disagree on. It's not somuch the the increase in volume.
(41:54):
I think, for example, there'smore conspiracy theories out
there. They're easier to findthan ever. But I think people
argue about the persuasionelement. So, so the the relevant
equation here is volume, youknow, or exposure times
persuasion. And people argueabout the persuasion parameter,
like is, has that gone on, up,gone up, or people become more
(42:16):
susceptible over time? Andthat's not so clear. And so, you
know, I would say, Yeah, the oneside, there's more volume
susceptibility people argue overI would say that, you know, we
find ourselves in moresusceptible situations when we
go online, where we're in echochambers, when we're in filter
bubbles, when there's a highlevel of toxicity of online
conversations, when we're beingtrolled. So, so I think when
(42:40):
we're being overloaded withmisinformation. So I think that
that sort of persuasion factordoes go up when we're put in
situations where it becomes moredifficult for us to control our
own biases or defend ourselvesfrom targeted attacks based on
our digital footprints that wemight not know about. So, you
know, I'm my reading of thatscience is that it depends, but,
(43:01):
but, but the persuasion can, infact, be, be higher than it was
before, depending on the groupthat we're talking about and the
situation they find themselvesin. And that's still large
groups of people, large enoughto, let's say, influence an
election.
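As an aside on the "wisdom of crowds" point Sander makes about community notes, here is a small, illustrative Python sketch (the true value and error spread are hypothetical, not figures from the research he cites) showing why the average of many noisy individual ratings tends to land close to the truth even when any single rater is far off.

# Toy illustration of crowd averaging (assumed numbers, purely illustrative).
import random

random.seed(42)
true_value = 70  # e.g. the "true" accuracy rating of a claim on a 0-100 scale

def individual_rating():
    # each person is individually quite noisy: some overestimate, some underestimate
    return true_value + random.gauss(0, 20)

crowd = [individual_rating() for _ in range(1_000)]
print(round(sum(crowd) / len(crowd), 1))   # crowd mean: close to 70
print(round(crowd[0], 1))                  # a single rater: often far off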
Jean Gomes (43:19):
So in Foolproof you talk about, you know, if you're going to gain immunity from this misinformation, you have to recognise that it comes in multiple strains. You kind of pull on this metaphor, and you make the distinction between fake news and the building blocks of misinformation. Can we talk about that for a moment?
Yeah.
Sander van der Linden (43:38):
So, so
one of the challenges with fact
checking is that you know youcan't check it every single
claim. And in fact, you know, it's true most news isn't, in fact, checked, because there just aren't the resources to go at the
level of the claim. So with prebunking, you know, if you do
this at a or inoculation at thelevel of a claim, you run into
a, you know, not dissimilarproblem in that you can't pre
(43:59):
bunk every possible thingsomebody might say in the future
that's false. But what you cando is look at what are the
predictors or the buildingblocks of disinformation, more
generally, over time, what arethe recurring themes and tactics
that people have used and again,whether it's a government or the
or tobacco industry or any typeof entity, a lot of the
(44:22):
techniques are or Nazi Germany,right? So a lot of the
techniques are the same, andthen it actually becomes
possible to predict whatmisinformation people might be
exposed to, and how to inoculatepeople against that. And some of
these building blocks includethings like conspiratorial
language, emotionalmanipulation, polarisation, you
(44:44):
know, creating us versus them,mentalities, trolling, which is
huge during during elections. Infact, if you look at what Elon
Musk is doing with the with theUK, it's a classic example of
trolling, right? But if youdon't know. That you're going to
keep responding, you're going tofeed into it, and you become
duped by by the technique. Andso that's, you know, I think
(45:08):
that's kind of a famous exampleof trolling. But actually
trolling is very common inmanipulation more generally.
Then, you know, there's thingslike impersonation, you know,
faking, you know, faking expertise during the pandemic,
fake doctors peddling fakecures. But also, you know,
(45:30):
there, you know, there werecases where, again, you know, in
public health, famous cases, andyou know, courts ruled on this
in the US that the tobaccoindustry used, you know, fake
experts with the white coat totry to sell people cigarettes,
right? That's, that's alsomanipulation. And so that, you
know, that technique is recycledacross lots of different
domains. And so what we do withthe pre bunking is we try to
(45:53):
tune people to these techniques.
To give you another one of myfavourite examples is, actually,
you know, there's a lot of talkabout how the vaccine, the COVID
vaccine, you know, is going tochange people's DNA, all the
conspiracies about, you knowabout that stuff. You know new
technology. You know, you knowmRNA vaccine, but, but actually,
of course, people can havequestions about new technology,
(46:14):
right? That's, that's totallyfair. But if you go back to the
1800s to the development of thefirst modern vaccine, which was
Edward Jenner's cow pox vaccineagainst smallpox. If you look at
the artwork for the 1800s I didthis in presentations. I show
people paintings where basicallycows and others were sprouting
out of people's heads andmouths. And the whole idea was
that if you took the cow poxvaccine, you were going to turn
(46:36):
into a human cow hybrid. And sothis idea that a vaccine is
going to change your DNA evenwhen it didn't have the word DNA
is 200 years old, and it's justrecycled over and over again.
And that's the idea ofinoculation, pre bunking, that
there's these building blocks ofdisinformation. And you know, we
lay out six in one of ourinterventions, and we give a
(46:58):
kind of an acronym to helpremind people, but it's always
the same with conspiracytheories. People say, okay, but
this one's true. And so sure,you know, sometimes a specific
theory can have merit to it, butthe psychology of conspiratorial
thinking is is fallacious in thesense that it's a predictable
narrative. There is somebodyplotting something behind the
(47:21):
scenes. They're having evilintentions. There's some
persecuted victim. In fact, inexperiments, we give people
these ingredients, and we say,can you come up with the
conspiracy? People come up withbeautiful conspiracy theories,
very elaborate, using thoseingredients. And it's always the
same ingredients. There's acelebrity is either dead
somewhere, but also still alive.
Avril Lavigne is a clone, right?
You know, Tupac Shakur is livingit up in an island somewhere.
(47:44):
It's always the same thing. Andso when you train people on
these things, even when peopleare kind of like, yeah,
Watergate and so on, or, youknow, Epstein, you know, it's
kind of fishy. Sure, nothingwrong with keeping an open mind,
right? But when you show peoplethe ridiculousness of how it's
always the same sort of, youknow, structure. People start
thinking, hey, maybe I'm beingduped by by the fact that it's
(48:07):
somehow always the samenarrative, just in a different
context. And people just neversee it all together. So that's
what we try to do, is put ittogether for people. And so
those are the building blocksthat that I talk about in the
book. And it's interestingbecause, you know, the the fact
checkers intuition is actuallyslightly different. It's that
you have to go deep andcontextual about a specific
(48:28):
claim. And you know, you canonly figure out what's true or
not by by looking at at a claimand investigating its truth,
sort of value. But I think wefind that at the level of the
building block, you can go muchbroader and you have much more
predictive power. And sure, it'snot going to be 100% to make a
(48:48):
joke about the book foolproof,right, for for every single
instance, but on average, thisis a pretty solid heuristic,
because they're using multiplecues. So in our in our
inoculation games, we don't givepeople one cue, for example,
emotional, emotionallymanipulative language, which is
a very popular one online forfor misinformation producers,
but sometimes it's true, a truestory could be emotional. Right
(49:11):
now, if a true story is highlyemotionally manipulative, you
can question whether it shouldactually get the status as true,
or whether it should be, youknow, true, but manipulative
enough, which is kind of mypreference. And so I don't
really mind if, if people becomea little bit more sceptical of
content. That is, that is kindof, you know, has some truth
value, but is actually prettymanipulative. I think that's a
(49:33):
good thing. But, you know, ifyou want to be technical about,
you know, it should be that ithas multiple cues. That's what
we do in the game. We teachpeople about conspiracies and
fear mongering and impersonatingfake doctors, and there's at
least six. And so models show,when you put all of these cues
together, the probability ofcorrectly identifying you know,
(49:55):
future disinformation that youhaven't seen before actually
goes pretty high. I mean, we're talking, you know, not 100%, but
we're talking, you know, 80% orsomething, which you know, if
you look at the research onhuman lie detection, if you get
above 50% that's a hugeaccomplishment. And so, you
know, I think it's probably thebest we can hope for evolving
Sara Deschamps (50:15):
leader friends.
If you're curious to get moreinsights directly from our
hosts, consider ordering Jean's book Leading in a Non-Linear World, which contains a wealth of research-backed insights on how to see and solve our greatest challenges, as well as Scott's book The Enneagram of Emotional Intelligence, which
can help you unlock the power ofself awareness sustainably in
(50:36):
every dimension of your life.
Jean Gomes (50:41):
What is the kind of
spread of motivations for spreading misinformation?
Why did Donald Trump, forexample, want people to think
that injecting bleach againstCOVID was a sensible idea? What
is it? Because it's not just onemotivation for doing this stuff.
What is it?
Sander van der Linden (50:57):
Yeah, so
there's been some work looking
at at what motivates, you know,influential elites to spread
disinformation. And it's, it'svaried. So there are some top
motives, obviously. One, perhapsunsurprising, is, is political,
all right? So people want togain political currency by
spreading misinformation. Thus,the second popular one is
(51:20):
financial. People want to makemoney off of it. You know, this
is very popular in the wellnessindustry. There's a lot of
grifting going on in wellness,and includes, you know,
celebrities and you know GwynethPaltrow, I'm sure you know,
wonderful person, but when shetells people about infrared
saunas, you know, and kombucha,you know, shakes curing COVID. I
(51:42):
mean, that's just dangerous,right? And so and so, that's
that's and, and it's hard toinfer people's intentions,
right? That's the that's thetricky. The tricky part. I'm not
saying anything about Gwyneth inparticular, but the wellness
industry in general is a isoften about a financial grift,
whereas the Trump top stuff ismore about about politics and
(52:04):
power and control and influence,and then sometimes it's a
combination of of politics andmoney. But perhaps
unsurprisingly, those are thethe key motivations growing your
audience. Often what you find isthe the social media element,
right? It's not 100% to blame on the person. Maybe on politicians, who
(52:24):
have responsibility andaccountability and know what
they're doing, probably. But forcertain influencers, the
journey, if you look at theirjourney, they don't start out
doing a channel duping people,right? They start out maybe
with, you know, a guitar channel, seeing that guitar hanging behind you, Scott, right? So they're doing a bit of
music, right? Yeah. And then,you know, they get a few clicks,
(52:48):
a few likes. But then, you know,randomly, they're just
frustrated one day, and theystart talking about an
alternative medicine. And then,whoa, they're getting a lot of
engagement now, a lot of likes.
Maybe I should, you know, starttalking more about alternative
medicine, right? And then, like,Oh, I'm going viral. Maybe I
should start talking aboutconspiracy theories and and
that. And then, you know, beforeyou know it, I don't want to say
(53:09):
that's been, you know, Joe Roganor Jordan Peterson's journey,
but, but certainly, you seepeople evolve from being
somewhat reasonable people to,you know, selling all kinds of
nonsense now, because, becauseit's, you know, that's what the
algorithm and the incentivestructure is rewarding on social
media, and that's what we callthe sort of the perverse
incentives of social media. Andwe've done research on that, you
(53:30):
know, what predicts virality,looking at millions of posts
across platforms, and the moreextreme, the more polarising,
the more dunking on the otherside that that's what gets
engagement. That's what getsclicks. So that's where I think
sometimes, and don't want toexcuse, take responsibility off
of people completely, but Ithink it's an interaction right
between the motives ofindividuals and and social media
that sort of creates this, this,this model. And yeah, the last
(53:55):
thing I'll say, Jean, to answeryour question, there is this
distinction we make betweenbelief speak and factual speak.
And politicians increasinglylike to use belief speak because
it seems authentic to peoplethat you know, whatever the fact
doesn't matter. They're beinggenuine, authentic about their
feelings and and what theybelieve to be true about the
(54:16):
world. And they have theaudacity to come out and say it,
and they find that people rewardthat more so than what the facts
are. And so we're operating moreon a belief, gut feeling based
model than a fact based model,which is a bit worrying when it
comes to politics.
Scott Allender (54:36):
So what are the
leadership implications here? So
as people lead teams, you know,in a world of increasing
polarisation and shifts inbeliefs and all that, they have
to manage, you know what? Whatare the tips and sort of sort of
ideas you have for leaderslistening on on some of these
watch outs to sort of keep theircultures free from the sort of
(54:57):
perils we're talking about?
Sander van der Linden (55:00):
Yeah. So
I would say, you know, leaders
are, in fact, in an incrediblyimportant position, because they
can help determine what thevalues of the team are, what the
goals of the teams are. And, youknow, in any given situation, I
say you want to make sure thatthat you're incentivizing
(55:21):
truthfulness, trustworthiness and accuracy, rather than things
like deception, lying,manipulation. And everyone feels
kind of intuitively, that'strue, but But in practice, you
know, it can be, it can bequite, quite tricky. So of
course, you know team leadersare in a in a position to
(55:43):
inoculate or pre-bunk misinformation with their team members. But, you know, pre-bunking or inoculation isn't necessarily a top-down tool that people need to enforce. I mean, it is something we created, you know, for the people, by the people. It's a thing you can share with others. And I think the most neighbourly thing you can do is that when you know that
(56:03):
there's misinformation out there, when you know that there's false information, or manipulation techniques, you help other people spot it in, you know, a non-judgmental way. That's often what I try to do. You can incorporate it into a team activity, a team-building activity: how do you spot misinformation, disinformation, manipulation? You know, do it in
(56:26):
a non-political way at first, you know, maybe talking at the level of the technique rather than about specific claims. But also, you know, some things are in the company's interest, right? There can be strategic disinformation about organisations as well. There are plenty of examples. Wayfair was caught up in a major conspiracy theory
(56:47):
that they were trafficking young children in their furniture, right? And they were just bombarded with these crazy satanic paedophile-ring conspiracies. And how do you prepare organisations for that?
I think one of the things that companies underestimate a lot at the moment is the strategic risk of disinformation about their products, about what they're doing, and how to actually prevent it, how to go out and preempt that,
(57:09):
rather than just being reactive. You know, you need to have a proactive approach to it. I gave a talk for, you know, risk auditors who you'd think are thinking about these things, but disinformation wasn't really something that they were factoring in. And so I think, yeah, when you're talking about
(57:31):
teams, you know, you want to prepare your team for misinformation, whether that's about the company or the organisation that you're working with, or about things that are going on in the world more generally. I mean, let's take a topic that is debated. You know, diversity and inclusion is relevant to every firm, right? It's a
(57:53):
prickly issue at the moment with companies everywhere, and team leaders are in a position to say, okay, what are legitimate questions and concerns that we can ask, and what is plainly misleading information or disinformation about what's going on in that space? And sometimes you might need an expert, right? Or invite
(58:14):
someone who is an expert to come give a talk. In fact, sometimes you might want to do a sort of community notes. Now, I think one of the problems that I see with this topic is that there's always a sort of perception of bias: oh, the leader is going to tell me what to believe, or, you know, we're going to hear from this person or that person. But, you know, why not crowdsource it? Then use, you
(58:36):
know, use the wisdom of the crowd as a team and see what the wisdom among the team is on a particular position, and then compare it to what experts are saying. It could be a fun exercise, right? It could be done anonymously. I often do it with my students. I have them rate claims individually, and then again as a whole group, and see who's
(58:57):
more accurate. And then see what the experts are saying, to get people more in touch with when their intuitions are leading them astray, when the group is useful, and how we deal with false information, both in terms of bottom-up approaches and top-down approaches. So, yeah, I think team leaders, and leadership in general, it's all
(59:19):
about setting the example, right? You want to convey factual information as a company. I mean, it's a huge risk. One of the things the auditors said during this conference, which I thought was interesting, is that if the company puts out information that turns out to be false, either about the company's history or about its products, you know, that's a huge cost to the company. It's a PR crisis. So actually having leaders
(59:43):
think about, you know, how can we incentivize and promote accurate information is going to be a key thing, especially in a space where knowledge is becoming contested. And, you know, how to phrase things, how to fact check things, and how to build internal teams that deal with the veracity of information and claims, I think that's going to
(01:00:04):
be quite an important topic moving forward,
Jean Gomes (01:00:12):
and aside from the book, is there anywhere else our listeners can get information on your approach that would help them?
Sander van der Linden (01:00:20):
Yeah, of course. So all of our interventions, the games and so on, are free. We have videos, we have games, we have educational material that people can use. Inoculation.science is the website where we house it all. It's all freely available. Inoculation.science kind of comes with the book. If people want to
(01:00:41):
learn more, that's where they can find some of the resources.
Jean Gomes (01:00:44):
Wonderful. What's next for you, Sander? What's your research focus right now?
Sander van der Linden (01:00:49):
Yeah, right now we're doing a project on deep fakes. This is a really tricky one, because, you know, in order to inoculate, you need to have these sort of stable strategies, right? Whereas with deep fakes, the technology is changing so fast, and, you know, if there's nothing to go on, no audio, for example, how do you inoculate? So that is a challenge. So what we're trying to do is expose
(01:01:11):
people to more ridiculous versions of deep fakes, and kind of build that sort of internal deception monitor for deep fakery, if you will, to see if we can actually come up with a pre-bunking technique that works. We're also trying to use the influencer model for the purpose of pre
(01:01:33):
bunking. You know, maybe official organisations aren't always the most exciting institutions to hear from. Maybe it's an influencer, right, that can deliver the message best. You know, you often see this with things like vaccine hesitancy in religious communities. Sometimes a religious leader is actually the best person to talk to people
(01:01:53):
about, you know, public health, and not the CDC or the NHS. And so we want to do the same with inoculation. Maybe people don't want to hear about defence against the dark arts from, you know, Sander van der Linden, and so maybe people want to hear from Taylor Swift, I don't know. And so, Taylor, if you're listening, you know, we're open to it. Actually, you know, she's on
Jean Gomes (01:02:16):
every week. So she's
on every week, yeah.
Sander van der Linden (01:02:20):
So, yeah, who's the best communicator in this sense? That's a big question. And, yeah, the other thing, coming back to the volume question: actually, it's very difficult to measure with limited access. So we're trying to figure out, you know, how can we measure and track the volume of disinformation that's out there in a very
(01:02:42):
technical way, and how to operationalize that, as well as, you know, how can we leverage AI to do good things. Can we automate pre-bunking? Can we automate fact checking? And, yeah, those are some of the things that are at the top of my mind at the moment.
Scott Allender (01:03:01):
Well, Sander, I could talk to you about this all day. I wish we had more time. Honestly, this is so important, so timely, so relevant and so useful. So thank you for sharing a bit of your wisdom and research with us. And folks, if you haven't already gotten your copy of Foolproof, please do yourself a favour and order it today. Thank you again. So
(01:03:24):
appreciate it. And until next time, folks, remember: the world is evolving. Are you?