
June 25, 2025 21 mins

You reckon you’re immune to propaganda and disinformation, right? A critical thinker who sees through the rubbish.

I thought the same - until I realised I’m way more likely to believe something if it backs up what I already think.

That’s not a personal flaw. It’s how we’re all wired. But we need to get better at spotting it, because it’s fuelling polarisation and making it harder to have real conversations with people we disagree with.

And with AI making it even harder to tell what’s real, it’s only going to get trickier.

If we want to tackle big issues together, we’ve got to become more sceptical, more media literate, and better at asking:

Where’s the evidence? What’s the source? Is there consensus?

This episode is designed to help you do exactly that - understand and analyse the information out there.

In this episode I talk about:

  • What confirmation bias actually is
  • A proper breakdown of propaganda, misinformation, and disinformation
  • How extremist disinformation groups take hold
  • Real examples of how disinformation can (and does) cost lives
  • How it delays action on the things that matter most
  • What to look for when spotting disinformation
  • The six main tactics companies and governments use to spread it
  • And how we can start pushing back


Giveaway! This week, we're giving away copies of Six Conversations We’re Scared to Have by Deborah Frances-White - a practical guide to having honest and respectful discussions on tough topics like politics, climate, and social issues. Join me over on Instagram to enter.


Find our full podcast via the website here: https://www.nowthatswhaticall.com/

Instagram: https://www.instagram.com/nowthatswhaticallgreen/

You can follow me on socials via the accounts below.

Instagram: https://www.instagram.com/briannemwest/

TikTok: https://www.tiktok.com/@briannemwest

LinkedIn: https://www.linkedin.com/in/briannemwest/

For our latest big project, find out more about Incrediballs here: https://incrediballs.com/


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:05):
Kia ora koutou katoa, and welcome to Now That's What I Call Green. I'm your host, Brianne West, an environmentalist and entrepreneur trying to get you as excited about our planet as I am. I'm all about creating a scientific approach to making the world a better place, without the judgement, and making it fun. And of course, we will be chatting about some of the most amazing creatures we share our planet with. So if you are looking to

(00:26):
navigate through everything green, or not so green, you have come to the right place. You're immune to propaganda, disinformation, and misinformation, right? You think critically; you can see through nonsense. Unfortunately, I guarantee that you're wrong, and I thought that recently too. I'm pretty cynical. I'm pretty sceptical of most things I see online, so I thought I was immune to it.

(00:47):
But it turns out that I am not. Because if you show me something that reinforces my beliefs, I'm way more likely to believe it, even if there's no evidence for it. And that is true of all of us, because that is how we are wired. But we need to get way better at this. This is one of the reasons that we are becoming so polarised, so extreme in our viewpoints, and why we can no longer have debates with people we disagree with. We not only need to get better

(01:07):
at this because we have a whole lot of problems that we need to solve together, but also because of the rise of AI and the horrifying fact that people can't even tell that that great white shark is not real (the one where she's hugging it in the water; it's clearly AI, guys). AI is just going to get better and better and better, and we are soon not going to be able to tell what's real and what's not. So we need to start being more

(01:28):
sceptical, more media literate, and better at looking for things like consensus and sources and evidence. One of my favourite examples of this is an example of misinformation in two different ways. I'm sure you know of the movie War of the Worlds. It's actually a book, and in 1938 a radio broadcast performed part of it like a news bulletin, like breaking news: aliens are invading. And people didn't take it very well.

(01:50):
They went a little bit crazy. They got a little bit panicked. There was rioting in the streets. There were people caught holding bottles of poison, because, you know, better that than the aliens. Apparently hospitals were filled with people, and everybody assumed it was the end of the world. Except there's absolutely no evidence for that at all, because the story used to illustrate how quickly misinformation spreads is itself misinformation,

(02:13):
which is kind of hilarious. None of that is true. There is no evidence for any kind of widespread panic. There was absolutely a radio bulletin, but the vast majority of people used their heads and realised: that's a book. For a less extreme example, last year there was a tweet posted on X (or Twitter, whatever) talking about how volcanoes release more CO2 in one year than all

(02:34):
human activity entirely. It spread freaking fast, and it had over 120,000 views before scientists were like, no, that's not the case at all, have some evidence. Unfortunately, there are still thousands, probably tens of thousands, of people who still believe that to be true, because they saw one thing on Twitter and ran with it, because it reinforced a belief that we all

(02:54):
really want to believe, right? Which is that climate change isn't our fault, because wouldn't that be a nice reality? Unfortunately, it's not the reality. By the time the correction was provided, that one tweet had already made it onto radio stations and into mainstream media. And because they didn't do any fact checking, that belief now appears in most comment sections

(03:15):
on almost any video I make about climate change. And why do we do this? It's because our brains favour speed over accuracy, right? MIT researchers, and a bunch of other people who have since looked at how misinformation spreads, tracked rumours on Twitter and found that they spread four to six times faster than real information. And that's because fake news and misinformation tend to fall into one of three categories,

(03:36):
right? It's novel, so it's new and therefore exciting. Or it's emotional: usually it generates some kind of rage. Or it reinforces beliefs people already have, so it is much more likely to be shared. That's called confirmation bias, right? When we see something that confirms what we already believe, we're therefore more likely to believe the thing we see. Then you get the illusory truth effect, where basically the more often you hear something, the more

(03:57):
likely you are to believe it's true. And now we have a whole bunch of people who believe that daddy long legs are the most venomous spiders in the world, despite the fact that they are definitely not. At the beginning of this video I used three terms that are often used interchangeably but do actually mean different things: propaganda, misinformation, and disinformation. Propaganda is the evil one,

(04:17):
right? Everybody knows that governments use it to manipulate people inside and outside their countries, and they do, to be fair. But companies use it too. Tobacco is a great example, because scientists knew that tobacco was bad for us way earlier than it became public knowledge, because tobacco companies fired back with propaganda designed to make you think that tobacco was not only safe but had lots of

(04:40):
good effects too. That is the same playbook, of course, that oil companies use when it comes to climate change and microplastics. If a claim's main job is to push you to believe someone else's agenda, that's probably propaganda. Misinformation is something that we are all guilty of, right? It's the thing that your well meaning family member says at Christmas dinner.

(05:02):
They forward you a study whose headline they liked. They don't have any intent to deceive; they just believe it to be true and wanted to share it on. It's just bad facts doing the rounds, right? We all do this to some degree, because who fact checks everything they say? You can't, right? There are so many beliefs inside us that we don't even necessarily know about. It would be impossible to fact

(05:22):
check everything. And that's where intent comes in, because disinformation is like misinformation's evil twin. Disinformation is spread by people who know it's incorrect but spread it anyway, because they want something. This is done by companies, governments, politicians, lobbyists, you name it. A lot of people do this to try

(05:42):
and achieve a goal. Obviously there's overlap between disinformation and misinformation. How do you know if someone knew, and does it even really matter if they did? Yes, it does. Disinformation is intent to deceive. Misinformation is just being wrong. We're all wrong quite often. There is no harm in being wrong. There is a lot of harm in deliberately lying to people because you want to achieve something. But why does any of this matter, right?

(06:03):
We all know the Internet is full of nonsense. Well, have you noticed how things are a little bit more difficult lately? People are more extreme in their views. We are polarised. The left is much further from the right, even though those are really just stupid categorisations of people, because we're far too complex to go into just two buckets. People parrot things they see online without any idea whether

(06:25):
it's true or not. And even when things have been debunked by authorities and experts in the space, people are much more likely to be like, no, that's not true, because we don't trust experts anymore. Information is the raw material for every single decision we make. It informs what we buy, which companies we buy from, which people we support, who we vote for, the future we plan for. And when you twist the information, you twist the decision, and therefore the outcome.

(06:45):
I know people, and I'm sure that you do as well, who are what you would call open minded and tolerant and kind and thoughtful. And they start to move along this path of disinformation campaigns, and they start to believe things that you think are a little bit odd. And it gets worse and worse and worse, until all of a sudden they've become part of some fringe belief group. This is how you end up with those extremist groups.

(07:08):
It doesn't happen overnight. Each lie builds upon the previous one, making the subsequent one seem more plausible. And then you have a bunch of people thinking that Andrew Tate is just misunderstood and really he's a good guy. And that leads to some pretty dark places. Obviously the anti-vaccination campaigns are a good example here, because this is where it kills people. There was an outbreak of measles in Samoa in 2019 that killed

(07:28):
83 people, mostly children. And this was because repeated vaccine disinformation campaigns had taken the vaccination rate down to about 30%. These disinformation campaigns capitalised really horrendously on the tragic deaths of two children who received a vaccine that had been mixed improperly. Instead of mixing the vaccine

(07:49):
with water before they injected it, the nurses mixed the vaccine with a muscle relaxant instead. Unfortunately, those two children died, and that started a 'vaccines kill children' campaign. One of the people who of course made that worse was RFK Jr, who travelled to Samoa in 2019, meeting with some of those anti-vaxxers and further bolstering their message. And something that made it even worse was that, naturally, those

(08:10):
families with sick children took those children to traditional healers. And those traditional healers had been sold 'immunity water' machines by Australian wellness companies, with absolutely no evidence or even logic behind them. So they treated these sick kids with this immunity water, which of course had no effect. And unfortunately, by the time they took them to doctors, it was often too late to treat those kids.

(08:30):
That is a very sad, very frustrating example of why it really matters that we stop believing all the myths and disinformation out there. And as an aside, I find it so interesting, right? So many anti-vaxxers talk about Big Pharma being evil. And look, they're certainly not angelic by any stretch, right? But they never talk about Big Wellness, the wellness

(08:51):
industry, which is three to four times larger than Big Pharma and more profitable. So the people who have no evidence behind them and make more money from it are supposedly the ones doing the good things. And yet the scientists, who make very little money on average, who do an awful lot of work and have a lot of checks and balances, they are the ones that are villainised. Now, it doesn't just cost lives, of course. It slows down and stops all sorts of action, right?

(09:13):
There are so many examples I could use here. I talked about tobacco earlier. Scientists knew that smoking was carcinogenic back in the 1940s, and they tried to warn people, but they were drowned out by tobacco companies putting out loads and loads of propaganda. Same deal with climate change. Scientists knew about the problem decades ago; in fact, oil companies did as well. But then companies like ExxonMobil, Chevron, Shell, and BP spent hundreds of millions of

(09:35):
dollars on PR to confuse people, delay action, and, more recently, convince people that it's too late to even bother doing anything now. And every month of delay adds about two gigatonnes of CO2 to our atmosphere, which is about 40 years of Aotearoa's emissions every single month that we delay. That's the power of misinformation.

(09:55):
And of course it also affects democracy. In the 2016 US election, everybody talked about the Russian bots that came from troll farms. Those troll farms created about 80,000 Facebook posts that reached around 126 million Americans, pushing memes of Hillary Clinton plotting a Muslim takeover, spreading fake headlines about the Pope endorsing Trump (which he didn't), and

(10:16):
urging left-leaning voters to skip the vote and protest instead. Which is quite a smart move, I will give them that. Fast forward to the 2020 election, and of course it was all about how the election was stolen. Facebook pages, talk show hosts, podcast bros, they all made up that mail-in ballots were rigged and voting machines changed votes. There was a slogan,

(10:37):
#StopTheSteal, which made about 400 million impressions in just six weeks. And Trump himself even repeated it. In fact, he still does. And yet a bipartisan Senate report later confirmed there was absolutely no evidence of fraud. Yet still, about one in four Americans believe that the 2020 election was rigged. There is no evidence whatsoever. And a whole bunch of you are like,

(10:58):
well, of course there was no evidence. So what's your evidence that it was? Is it just a gut feeling? When you are feeling defensive about your belief, instead of getting mad, I urge you to just dig into why, and have a think about why you're so determined to protect that belief. It's a really healthy thing to do, and it's a thing that very few people do, because it bruises your own ego, right?

(11:19):
But the more you challenge your own beliefs, the more informed you become, the more thoughtful you become, and the less extreme. Because at the end of the day, we all want the same thing; we just differ in our opinion of how to get there. That's OK. But of course, all that misinformation about stopping the steal is what led to the January 6th assault on the Capitol.

(11:40):
Misinformation can lead directly to violence. Our election here in Aotearoa wasn't spared either. There were anti-co-governance Facebook groups and podcast bros again claiming that Labour had a secret plan to confiscate private land, and all sorts of other nonsense. And follow-up polling found that, again, about one in four of the undecided voters still believed one of those claims at

(12:03):
the time of the election. Fake news sticks, and when people don't really have anything real to point to, they just make stuff up. There are countless examples, but I don't want this video to go on too long. What I want this video to do, ultimately, is give you some examples of what to look for. Broadly speaking, there are six tactics that companies and governments and people use. Number one is astroturfing, which is a bizarre term.

(12:23):
It's got nothing to do with fake grass; it's when groups with a very specific agenda put on a nice community mask to hide who funds them and what their goal ultimately is. The best example of this is, again, in the US: there's a group called Citizens for Affordable Energy, which sounds good, right? We all want affordable energy. It's funded by the Koch brothers, who are, of course, deep in the fossil fuel industry and not renowned for being

(12:45):
good people. The billionaires behind Koch Industries have funnelled hundreds of millions of dollars into anti-climate-change lobbying. Another tactic is cherry-picking experts. Weirdly, many producers froth over a fight because it draws more eyeballs, so you see the same contrarian scientists pop up everywhere, right? There are three contrarian scientists who

(13:08):
are used way more than any other scientist in the climate change debate. The biggest reason, of course, is that there are so few scientists with any expertise in relevant areas who actually say that climate change isn't real. So you've got a geologist who says CO2 is plant food (and before you say 'but it is': yeah, it is, but it's a bit more complicated than that), and you have an engineer who just blames the sun, and they get equal air time with the hundreds of peer-reviewed

(13:30):
climatologists who are actual experts. They do not deserve to be in these conversations. They don't have the knowledge or the expertise. You see this a lot in articles; I've talked about it before, and there's an article on my Substack about it should you want to read something more in depth. It's a good rule to remember that just because someone has, say, a PhD in one area, or is a doctor, doesn't mean they have expertise elsewhere. Scientific expertise

(13:51):
tends to be very narrow. Then you've got emotive language, and this is something people don't even notice happening, but it's very, very effective. Groups come up with slogans like 'keeping the lights on', 'jobs versus the planet' (pitching it as an either/or), or the infamous 'ute tax', all designed specifically to fire you up by positioning something as a fight, priming your brain.

(14:14):
Before you even actually know what any of it's about, they've already predisposed you to have an emotional reaction, and you don't even know why. Some organisations specialise in manufacturing doubt. So if public opinion starts going one way, lobbyists come out with things like, 'Maybe, but I think we just need more research first.' They've not actually refuted anything or provided any evidence. They've said something which on

(14:35):
the face of it seems fairly reasonable, and they say it even when you have 97% scientific consensus. Visual misdirection is another one that you'll be very familiar with, without even necessarily being familiar with the tactic. Why do you think BP's logo is green and shaped like either a flower or the sun, depending on your perspective? That's very specifically chosen to evoke good feelings about the

(14:57):
company, right? Despite the fact that they are a fossil fuel company at heart? That is visual misdirection. You'd be surprised how powerful colours can be. And the final tactic is what you're seeing a lot of right now, which is outrage amplification. As I said earlier, fake news travels fast, and so does outrage. A memo from Meta back in 2023 stated that posts that sparked outrage or disgust travel about 25% further than a more normal

(15:19):
post. The louder the shouting, the more reaction, and the more the algorithm boosts it. In fact, this is a well known marketing strategy: if you want to stand out on TikTok, marketing gurus will tell you to say something controversial even if you don't believe it, which is kind of gross. So now you know the tactics, how do you actually stop it? Well, that is the million dollar question, right? People have studied human behaviour for decades to try and understand

(15:40):
how we actually get past the lizard-brain reaction and into the thinking part. I'm actually doing some philosophy papers, because thinking about how we think sounds weird but is actually really fascinating. So, a couple of tips. If you read something and it sparks any kind of emotion, whether it's rage, delight, or feeling smarter because you were right, take a

(16:02):
deep breath, and instead of sharing it or taking it as written, go and find proof. Open a new tab and just run a background Google search. Try and find the ultimate source of the information; it's often a scientific study that has been misrepresented in the media. This isn't a difficult thing to do. It's a time consuming thing to do, which is why, you know, I wouldn't expect you to do it for everything, but for big things, maybe fact check them. This is going to be more and

(16:23):
more important, of course, as I mentioned, with AI getting more and more realistic. Look for the funding. Every study, every think tank, every report, somebody funded it, right? And quite often that's public information. At the bottom of every scientific study there are funding disclaimers, and in a proper journal they should be complete. They should also state whether the authors have any conflicts. Now, just because they do, or

(16:43):
just because they are funded by the organisation behind it, doesn't necessarily mean that the information is not true, but it is something to be aware of. Funding can absolutely make a difference. And when you find that original source, does it actually agree with the thing that you've read? So many times I have read some amazing, outrageous headline talking about how wonderful something is, or how this is a cure for cancer, and then you read the study and

(17:06):
it was, you know, done on five mice. And the researchers themselves are like, well, you know, this is interesting, but it doesn't prove anything. And yet the media headline is like 'cure for cancer'. And this is why people believe that the cure for cancer was discovered decades ago and Big Pharma is sitting on it. This is why people don't believe in science anymore: because they think it flip-flops. When in reality, scientists are

(17:27):
some of the most annoying people to talk to ever, because it's so hard to pin them down on anything; they need to be convinced with evidence, whereas journalists will print something that makes you want to click on it. And it's not their fault; there's an awful lot of pressure on journalists. But our responsibility as readers is to maybe dig a little bit deeper. And when you can't do that, keep an eye out for consensus, right?
To use the climate change example again, if 97% of
scientists are for something, that's probably a good sign that

(17:49):
it's true. Whereas if you've only got one or two podcast bros talking about how it's actually just a solar cycle, they might just be selling T-shirts. Consensus isn't sacred, and none of these are silver bullets, which is just confusing. But overturning scientific consensus does take a shitload of evidence, right? Not just a podcast microphone, ironically. And then there are two more things

(18:12):
I think we should all do, and number one is curate your feed. Most of us are on social media, and eventually it becomes a bit of an echo chamber of your belief systems, right? Because you interact with things you like. And eventually, if your feed is anything like mine, it's mostly just cute animals and environmental information. That's great, but that's not really the real world. So try and follow a spread of credible outlets, even ones that

(18:35):
you don't necessarily agree with. Science communicators, political commentators, whoever they are, just make sure you don't exist in an echo chamber of your belief systems. You can get rid of the rage baiters, they're not helpful at all, but make sure there are a few voices in there that you kind of disagree with. And then finally, be responsible with your sharing. How many times have we all seen someone share something we know isn't true? Or we've done it ourselves,

(18:56):
right? If you wouldn't stake your reputation on that claim, you know, if you wouldn't say it in front of your grandma at the dinner table, maybe don't put it on your stories. Getting something wrong is absolutely fine; it's part of being human, and we all get things wrong quite often. The most reliable, intelligent

(19:17):
people on earth are the ones who are the most willing to change their minds and admit they are wrong based on new evidence, which is a skill. If you do even one or two of these things often enough, it kind of becomes like muscle memory, and you become that cynical person in your group who people run things by. I shouldn't actually say cynical, really; I should say sceptical, because scepticism is largely the backbone of proper science.

(19:37):
Scientists don't believe anything until there is evidence for it, and that's an important attitude for all of us to cultivate. It's OK to say 'I don't know yet' or 'I don't have an opinion because I haven't seen any evidence'. The world would be a better place if more of us did that. It's funny, because even talking about misinformation riles people up, which is what it's intended to do, right? There is a link between our egos

(19:57):
and our sense of self and our belief systems, so when you chip away at people's beliefs, it feels inherently personal. So the important thing, when you're talking to someone about something they believe that you don't think is true, is to approach the conversation from a place of mutual understanding. And because I think this is very important, it is actually something I'm going to do a whole other episode on in a couple of weeks' time.

(20:18):
But until then, I have our next book giveaway for you. Reading rates are at an all-time low; we are losing the love and the skill of reading, and it's such a shame. I was that kid who used to sit outside on the swings and read a book when I was made to go and play outside. My parents worried about me, I think. But the fact that we're reading less is showing in things like literacy rates and just general knowledge. So every week we're going to be giving away a couple of copies

(20:40):
of a book that is somewhat relevant to the podcast, maybe, but is a book that we think you should read. And this week it's Six Conversations We're Scared to Have, which was written by Deborah Frances-White. It's a sharp, funny guide on how to have conversations about exactly what I've just been talking about: race, politics, privilege, climate change, without people throwing rocks at you.

(21:01):
Which I think is a perfect fit for today's topic. Next week I'm talking about mining the deep sea, which is obviously a terrible idea, but here we are. Anyway, kia ora, and I will see you next week. And there you go. I hope you learned something and realised that being green isn't about everything in your pantry matching with those silly glass jars, or living in a commune.

(21:22):
If that's your jam, fabulous. But sustainability at its heart is just using what you need. If you enjoyed this episode, please don't keep it to yourself, and feel free to drop me a rating and hit the subscribe button. Kia ora, and I'll see you next week.