Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
There's a lot of misinformation that's developed for these extremely cynical reasons. I mean, part of why we're here is to talk about climate change, and here we see industries spending massive amounts of money so that they can make more money, and doing it purposely to confuse people, and
(00:22):
doing it in a way where they know they're going to kill people and potentially wreck the Earth.
Speaker 2 (00:29):
Oh, fuck.
Speaker 3 (00:37):
Welcome to Unfucking the Future, the show where we learn about the surprising and innovative ways that scientists, entrepreneurs, activists, and even philosophers are fighting the climate crisis. I'm your host, Chris Turney, and I believe that together we can unfuck this mess. Let's get started.
Speaker 4 (00:56):
We're Unfucking the Future.
Speaker 3 (01:02):
Okay, you've been here a while. You've heard about the bad...
Speaker 1 (01:06):
Mass evacuation this morning along Australia's southeastern coast, with bushfires
looming and extreme danger ahead.
Speaker 5 (01:14):
A fierce heat wave is gripping parts of Europe, with temperatures reaching more than forty degrees Celsius.
Speaker 2 (01:21):
After twenty-four hours, all that remains are flooded homes and floating debris. On Sunday and Monday, Mediterranean storm Daniel swept through eastern Libya, washing away entire neighborhoods.
Speaker 3 (01:31):
But let's pause for a moment and consider an amazing possible future. It's twenty fifty and I'm Chris Turney, a retired climate scientist. Almost everything around me has been electrified: our homes, our work, our cars. People no longer live in those harmful little suburbs. We live in small communities
(01:54):
where we know our neighbors, and it's just a short walk to the grocery store. And if we do need to go further, we can just use the clean, affordable public transportation that's widely available across the world. Fossil fuel companies are a thing of the past, and the people who were working in coal and oil are now working green jobs.
(02:16):
At school, kids learn about smog and wildfires, but not
because they need to learn how to survive them. They're
learning about these events in their history classes. Natural disasters
because of climate change have largely stopped, or at least
become less frequent and less extreme. And the world just feels pretty damn good. While this might sound like a
(02:41):
total fantasy, it's totally within our reach. We're going to
have to work really fucking hard to achieve this vision,
but one of the most pressing challenges is that the world is full of bad actors pushing bad information in bad faith. Misinformation casts doubt on the urgency of the
(03:03):
climate crisis, and it distracts us from the real issue:
We need to take immediate action at speed and scale.
It's easy to find examples of how mis- and disinformation is influencing the climate crisis, like the idea that walkable fifteen-minute cities are some sort of plot for totalitarian control,
(03:24):
or that the wildfires in Maui last year were started by energy weapons or space lasers, or that shifts in the Earth's magnetic field, and not human activity, are responsible for the climate crisis. That one was promoted by Joe Rogan and got millions of views on TikTok. All of
this is easily disprovable, of course, by any of the
(03:45):
world's thousands of climate scientists who look at the actual
facts and data and models all day long. But too
many people are believing these things, and that's a problem for all of us. Today we're going to talk about how and why misinformation came for the climate crisis and what we can do about it. Our guest is Cailin O'Connor. She's
(04:09):
a professor of philosophy at the University of California, Irvine, where, among other things, she researches misinformation and how false beliefs spread between people. And Cailin is not like other philosophers. She's a cool philosopher. On her website, she's got this beautiful photo of herself, her kids, and a chicken. It's
(04:30):
not the kind of headshot you see most academics post. They're normally quite formal and a bit intimidating.
Speaker 2 (04:37):
I'm in a discipline which is, like, very male, very white.
Speaker 1 (04:42):
Philosophy, even though it's in the humanities, has been, like, the humanities discipline where it hasn't diversified as much as other ones.
Speaker 2 (04:51):
So as a, you know, as a woman, as a
Speaker 1 (04:54):
full professor, I sort of like to open the doors a little bit and be like, I'm a real person, to younger women who want to join the field, like, you can be a real person too. So in this picture, I have my three kids, and I bring them along to conferences sometimes, and they show up on Zoom with me, and I tell everyone who will listen about my chickens.
Speaker 3 (05:16):
Yeah, those chickens are a pretty big part of her life.
Speaker 1 (05:19):
I am zoned to have up to three chickens, and I have seven chickens, and I recently had to build them a new enclosure because my neighbor was like, enough with the damn chickens. They're all different breeds. They're very beautiful. They are named Starfrost and Red Velvet and Molasses and Cream Puff and Oreo, mostly after desserts. That's a wolf.
Speaker 3 (05:41):
So you must be drowning in eggs. Or do you give them out? Maybe to appease your neighbor?
Speaker 1 (05:46):
Do you give them away? Yeah, I just give them to neighbors if we have too many. And like, my neighbors and I, we have, you know, like, someone brings over the extra lemons, someone brings over the extra ginger cake, someone brings over the extra eggs.
Speaker 2 (05:58):
It's like a very nice little exchange.
Speaker 3 (06:00):
It sounds lovely. It sounds lovely. In that Stars Hollow-esque setting, it could be easy to forget the existential crisis we're facing, but Cailin says she's rarely not thinking about the environment. Part of that is due to her upbringing.
Speaker 2 (06:16):
My dad was an environmentalist.
Speaker 1 (06:17):
I used to read Ranger Rick magazine, and my grandfather wrote an article, I think in the seventies, about the threats of human emissions and the possible emergence of climate change as a result. So yeah, I've been worried about climate change literally since I was five or
Speaker 2 (06:35):
Six years old.
Speaker 1 (06:37):
Really, that's sometimes depressing, to be like, now I'm forty and I'm even more worried about it than before.
Speaker 3 (06:45):
The twenty sixteen US presidential debates made Cailin feel particularly concerned about our future.
Speaker 1 (06:51):
I mean, I still remember in one of the Trump
Clinton debates, Hillary Clinton saying something like the Russian government
is making these claims about me, and kind of jumping
back up, like.
Speaker 2 (07:05):
The Russian government, what are we talking about? You know.
Speaker 1 (07:08):
At that point, I at least was just not aware
of the way that big forces were starting to use
social media to try to control or shape political events
and outcomes. My collaborator James Weatherall and I both started thinking like, oh, this is really going to matter, this is going to be important. But one thing that's kind of funny is that, even writing this
(07:31):
book on misinformation.
Speaker 2 (07:32):
So we wrote this book The Misinformation Age.
Speaker 1 (07:34):
Starting then, we really underestimated how serious of an issue it would be, and how long-term. So we thought, we have to write this book really fast.
Speaker 3 (07:44):
The book, The Misinformation Age: How False Beliefs Spread, is an impressive and stellar contribution to the field. It's totally digestible for the general public, and it digs into how people understand what is truthful and what is not, and ultimately how misinformation has become such a huge issue in
(08:05):
our digital world. So getting into your work, you call yourself a philosopher of science. Now, the word philosopher to me makes me think of Confucius or Aristotle, people whose fields of study definitely do not overlap with mine at all. So what does it mean to be a philosopher of science?
Speaker 1 (08:24):
Well, first of all, I'll just point out all the scientists used to be philosophers. I mean, Aristotle wrote on all sorts of things, biology and physics and astronomy, you know, everything, and then the sciences kind of peeled away and philosophy was what was left. But philosophy of science is an old discipline and there are a lot
(08:45):
of things philosophers of science do. So one thing is work on understanding how science as an enterprise works.
Speaker 2 (08:52):
You know, how does scientific progress work?
Speaker 3 (08:55):
For thousands of years, philosophers have asked questions about the
way we live and why, about how we work together,
and why societies work the way they do.
Speaker 1 (09:05):
One thing that's a very traditional question in philosophy is
how does knowledge work?
Speaker 2 (09:11):
How do we come to know anything?
Speaker 1 (09:13):
And also what does it mean to have a belief
in something that's justified. One thing that a lot of
philosophers work on now is what's called social epistemology.
Speaker 3 (09:24):
That's the study of how people understand knowledge, how they
understand what is true, and how we search for truth.
Speaker 1 (09:30):
So Descartes was focused on these questions about individual knowledge, like how can I, as this one little isolated unit,
Speaker 2 (09:39):
Have confidence that the things I believe are true?
Speaker 1 (09:42):
But increasingly people came to understand that that's not really
how human knowledge works. So other people tell us things,
we choose to trust them, we uptake those things as beliefs. We do some other things: we reason about, like, is this consistent with the other things I believe? Should I trust this person, and for what reason?
Speaker 2 (10:01):
Or should I not?
Speaker 1 (10:02):
Our knowledge, in fact, is just very very deeply social.
So one thing a lot of philosophers study is social
epistemology, which relates to philosophy of science, because a lot of things in philosophy of science are work on how groups of scientists come to know or believe things.
Speaker 2 (10:18):
So how does a group.
Speaker 1 (10:21):
of humans who are interested in climate change come to believe the climate is warming, and at these rates, and as the result of these causes? So those are things that philosophers work on.
Speaker 3 (10:36):
So with fake news, then, there's an element of, actually, you've got a belief that's not true that is being presented effectively as truth.
Speaker 1 (10:49):
So, first of all, I don't necessarily love to use
the term fake news, because, as a lot of people
have pointed out, fake news was this term that was, like, very much applied to this specific phenomenon happening in twenty sixteen, twenty seventeen, twenty eighteen, when people would make fake news sites and fake news articles, and then the
(11:10):
term got co-opted by the right to describe a lot of true things as fake when they weren't.
Speaker 6 (11:16):
They are the enemy of the people, and we could have a country that would be able to heal and get together, except the media foments it. They're so corrupt. And you know, I call it, I came up with the term fake news a long time ago. I don't know if I'll get credit for that, but that's okay.
Speaker 3 (11:33):
As Cailin said, fake news has taken on a new meaning for a lot of people. It just means anything I don't like.
Speaker 1 (11:42):
So I often will use misinformation or misleading content. So that's why I'm, like, switching language a little bit. So there are a lot of ways that people spread misinformation or misleading content, and a lot of reasons why people, like, uptake it.
The most basic just has to do with this fact
(12:04):
that we're social learners, where we necessarily have to trust
each other in order to learn most things we know
about the world, which means that people tell us things.
You know, most of the time it's in our best
interest to believe those things, and sometimes it's not. Sometimes
those things are wrong, but we just don't have the
(12:24):
ability to always differentiate between the stuff we're getting from
other people that's true and that's false. And so social
knowledge is tremendously powerful. But once you open up that channel,
this door for social information to come through, you're going
to open up a channel for misinformation for fake news
(12:45):
to come through too.
Speaker 3 (12:46):
That's a really powerful analogy. I really like that, Cailin. So I have to ask, and I mean, I'm very aware that I have done this: have you ever fallen for any news articles that are basically misinformation?
Speaker 1 (13:00):
Oh?
Speaker 2 (13:00):
One hundred times.
Speaker 3 (13:01):
Yeah, I think people would find that reassuring.
Speaker 1 (13:05):
I mean, it was stressful to write a book about
misinformation because I guarantee somewhere in that book we say
something false, probably multiple things false, even though we did
the best research we could. But yeah, I'm a human,
and like all other humans, I often fall for misinformation.
One of my favorite little examples is I was doing
(13:26):
an interview on misinformation and someone brought up an example of, like, a propagated false belief, which is that daddy longlegs, these spiders in the US, are the most poisonous spiders in the world, but their mouths are too small to bite you.
Speaker 2 (13:41):
And as he was saying it, I was like, yep, I believed that one.
Speaker 3 (13:43):
Until now, I've not heard that one, actually. That's amazing. It's quite a story.
Speaker 2 (13:52):
As soon as he said it, I was like, oh yeah, that's very dumb. Of course that's false. But no, I had believed it until then.
Speaker 3 (14:00):
I've done it too. A few years ago, there was this news article about a polar bear who had become stranded on a Scottish island after the melting Arctic ice had separated the poor animal from its home. And I remember reading this piece and going, oh, that's amazing. But of course this was just an April Fools' prank by
(14:21):
a newspaper. Even I, as someone who studies the climate and melting ice, fell for it. Now, that kind of misinformation is fairly harmless, both to us and to the daddy longlegs and the polar bears not in Scotland. But other misinformation is developed by bad actors to influence politics, the economy, and the very social fabric of our communities.
Speaker 4 (14:47):
We're unfucking the future. We're unfucking the future.
Speaker 3 (15:00):
One of the most interesting, or perhaps most terrifying, things about the mis- and disinformation landscape today is how oil and gas are using environmental and nature activism to spew falsehoods.
Speaker 1 (15:12):
There's been a lot of stuff spread by oil and gas about how windmills kill birds and how windmills harm whales. So here are people trying to glom on to people's environmental impulses, their desires to help other animals, protect the environment.
Speaker 2 (15:31):
But what they're
Speaker 1 (15:31):
actually trying to do is to stop action on climate change.
Speaker 3 (15:35):
Climate change, really pressing those emotional buttons. Now, what if
you could help clarify for me and the listeners as well,
what is the difference between misinformation and disinformation?
Speaker 1 (15:48):
So the way people typically pull those apart is to
say that disinformation is false and it's intended to mislead,
it's misleading, and someone's trying to mislead you, whereas misinformation
is misleading but there's no intention to.
Speaker 2 (16:07):
Lead you astray.
Speaker 5 (16:09):
Ah.
Speaker 3 (16:09):
So some of the time, when we're talking about these climate issues which are being deliberately misleading, they're actually, effectively, disinformation. Is that right?
Speaker 1 (16:20):
Yes? Though I think that when we talk about information
on social media, it's not like this is a bad
way to differentiate things, but it ends up not, I think,
always capturing what's happening on social media, because a lot
of what you see is disinformation created by cynical parties
(16:41):
that then becomes misinformation.
Speaker 3 (16:42):
Ooh, so they are actually sharing it not with the intent to mislead; they think it's true.
Speaker 1 (16:50):
Exactly. So most of the people spreading, for example, say, the whale claims, are not going to be people who are trying to mislead anyone, or people who want a bad outcome for environmentalism. So in that way, it sort of transforms into misinformation, and it's designed to transform into misinformation.
Speaker 3 (17:10):
In Cailin's book The Misinformation Age, she and her co-author James Owen Weatherall write about Roger Revelle, a climate scientist who was one of the first people to study global heating.
Speaker 1 (17:22):
He was a major influence on Al Gore, who of course has been a climate activist as a politician. He did a lot of work showing that carbon dioxide was increasing in the atmosphere as a result of fossil fuels.
So he really spent his career showing that the climate
was changing and in fact raising alarms about global warming.
Speaker 3 (17:45):
By the time Jim Hansen gave his testimony in front of Congress in nineteen eighty eight, Revelle was already seventy nine, and as he aged he became very sick. Now, during this time, a man named Fred Singer came along and took money from oil and gas. He was basically paid to sow skepticism and doubt about climate change. Fred Singer has
(18:07):
a long resume. To name some of his greatest hits: in the nineteen eighties, he helped promote confusion about the causes of acid rain, the health effects of smoking, and ozone hole depletion. And in nineteen ninety one, Fred took some of his previous writing, repurposed it into an article, and added Revelle's name as a co-author. Now,
(18:29):
all of this happened at a time when Revelle was gravely ill.
Speaker 2 (18:32):
The paper was skeptical about climate change.
Speaker 3 (18:36):
So you've got this subject matter expert, now in old age, claiming that maybe the science behind global heating isn't as solid as he previously claimed, maybe it shouldn't be believed, maybe there's nothing to be concerned about here.
Speaker 1 (18:53):
And so Revelle's reputation as a climate scientist was weaponized for climate skepticism.
Speaker 3 (19:01):
But people who knew Revelle, including his research assistant who had been working with him for twelve years at this point, said Revelle had been hoodwinked into attaching his name to the paper.
Speaker 1 (19:12):
A lot of people cast doubt on whether Revelle was really able to consent properly to being an author on this paper.
Speaker 3 (19:19):
And the paper had serious repercussions on the public debate around climate change. For example, then-Senator Al Gore had been talking about the greenhouse effect and climate change for many years, but now the top scientist on climate change was apparently doubting his own work. It cast doubt on the whole thing.
Speaker 2 (19:39):
This was used to make Al Gore look foolish.
Speaker 1 (19:43):
You know, the person who, you told us, had shown all these things about how the climate is warming, even
he doesn't actually think it's warming. This is something we've
seen happen again and again in the climate skepticism movement.
Speaker 2 (19:58):
Is that, you know, oil and gas,
Speaker 1 (20:00):
the people sort of working to fight understanding on climate change, they get real scientists to take up climate skeptic positions. However, these scientists are very rarely, like, climate scientists.
Speaker 2 (20:13):
In fact, it's almost always physicists.
Speaker 1 (20:15):
I don't know what is wrong with physicists, producing these people who are willing to do this. But so they'll get these very prominent physicists to sort of be the
face of climate skepticism. And because these are actual scientific
experts and people usually trust scientific experts, their skepticism ends
(20:36):
up looking very powerful.
Speaker 3 (20:39):
Fred Singer was repeatedly interviewed about the paper and the huge change in Revelle's view on global heating. In the interviews, he just outright lies about the situation.
Speaker 7 (20:50):
He joins us live from Washington, DC. Doctor Singer, what was Roger Revelle's view of carbon dioxide as a greenhouse gas when you co-authored that Cosmos article back in nineteen ninety? He was very relaxed about it. He basically
Speaker 4 (21:08):
Looked at this as a grand geophysical experiment.
Speaker 3 (21:12):
And this, of course, wasn't the first time something like this happened. The Roger Revelle case is just one of many alarming examples of how the big oil lobby uses real scientific experts to amplify their quack science.
Speaker 1 (21:27):
I mean, there have been others. So, I mean, this was a strategy that was really engineered by big tobacco in the fifties and sixties. For example, they created a group called the Tobacco Industry Research Committee that was supposedly going to research whether tobacco was harmful. It was in fact a propaganda body. But it was headed by
(21:49):
a molecular biologist who did, like, genetics, and so he again was a scientific expert. He of course was in no way, like, a health expert or a doctor or a medical researcher, but he lent a sort of veneer of expertise to this group.
Speaker 3 (22:07):
So when we think about misinformation, we think about Russian state television, but one of the things you write about is actually way, way scarier, and it's the idea that these days the propagandists are family and friends, and that's because of social media. I wonder if you could take us way back to twenty sixteen and explain how social
(22:28):
media really changed the state of misinformation.
Speaker 1 (22:32):
Yeah. So social media, we can think of it broadly as having changed the way information can flow between people. It's kind of special in that it changes very quickly, and there's always new platforms, and the way information is flowing is always changing. First it's soundbites, and then it's pictures, and then it's words, and then it's words and pictures,
(22:54):
and then it's videos, and so the change is so fast that it's sort of hard to respond to or regulate. Here are a few of the things that really matter about social media and why misinformation can spread so well on there.
So one thing is that people can get big platforms
even when they don't really deserve big platforms. Another thing
(23:15):
is that it's hard to know the source of information.
So if you're thinking about like person to person social
exchange of information, you're looking at another person in the face.
You can see who they are, you know where they live,
you know how you met them. You probably know other
people they're connected to. You have a lot of information
about who they are. Once you get onto social media,
you're looking at profiles where there may or may not
(23:38):
even be a real person corresponding to that profile. That profile could be a bot, it could be a Russian agent, it could be another political agent, it could be someone's secret burner account where they're trying to do something, and so you have less ability to judge your social sources as trustworthy or not trustworthy. In addition, we see propagandists
(24:00):
able to take advantage of various aspects of social media
to sort of tweak our social feelings of trust in
ways that are much harder to do person to person. So,
for example, they can get a bunch of bots to
like a piece of misinformation, and then that looks to
us like this is very popular.
Speaker 2 (24:21):
A lot of people like it, a lot of people
trust it.
Speaker 1 (24:23):
That's a cue to us that it is trustworthy, that we ought to engage with it, or that we could share it, and so there are ways for our, you know, our normal knowledge-forming mechanisms to get hacked.
Speaker 3 (24:42):
And Cailin says that right now a lot of people don't trust the institutions that we've historically gained knowledge from, like traditional media and subject matter experts.
Speaker 1 (24:54):
A lot of people argue that for the most part,
people are trusting of experts still, including of scientists, and
yet we do see this kind of phenomenon of people
describing the New York Times as fake news. A lot
of what is driving that, I think is cynical actors
(25:15):
who are trying to erode public trust in science, and
especially you see this in the US among right wing
politicians and especially populist type politicians, because of course populism
is associated with this kind of rejection of authority or expertise.
(25:36):
But there's often a reason people are doing it, which
is that if you can get people not to trust
the real scientists, not to trust the real journalists, not
to trust these good sources of information, then they're easier
to control.
Speaker 3 (25:49):
Just backing up: we've got bad actors who are trying to influence public opinion so that those people, say, stop protesting in the streets about the government's inaction on the climate crisis, and these bad actors are putting a shit ton of money into advertising and propaganda campaigns, but they're also influencing our politics through lobbying and funding the
(26:12):
campaigns of politicians who side with their bogus narrative. This might seem pretty bleak, but there is a solution.
Speaker 1 (26:23):
Thing number one: campaign finance reform.
Speaker 1 (26:26):
That's a depressing topic, because the people who are financed are the ones who have to implement campaign finance reform. But without it, it's pretty hard to see how we'll get effective climate action, because there are just so many politicians who are funded by oil, gas, coal, these massively
(26:48):
wealthy corporations that can afford to give a lot of money to politicians. Another thing has to do with regulation of online content. So, free speech.
Speaker 2 (27:05):
We don't want to impinge on free speech.
Speaker 1 (27:08):
We do want it to be the case that all
of us can be in informational environments that allow us
to form good beliefs, that give us the freedom to
think effectively about what's happening around us, to learn about
the world, that give us the freedom to make good
choices for our own lives and protect our own interests.
So you know, there are ways in which we can
(27:32):
think about us as having rights to be in good
informational environments as well as other people having rights to
share bad information.
Speaker 2 (27:40):
One thing a lot of philosophers have talked about
Speaker 1 (27:42):
is the difference between stopping speech and deplatforming. So we don't think of people as having a right to have any platform for their false beliefs or bad ideas. Like, nobody is entitled to come to a university and give talks on their flat earth theories.
Speaker 2 (28:03):
In the same way, people
Speaker 1 (28:04):
aren't entitled to have the algorithm on Instagram promote their content for them
Speaker 2 (28:10):
to a lot of people. That's not an entitlement.
Speaker 1 (28:14):
So we can have regulations where people can say what they want, but where we don't have to platform or spread what they're saying. And you know, it's not as if people have an entitlement to even be on a social media platform; that's up to the social media platforms. And if, you know,
(28:37):
they were to take that more seriously, I would think a good model is something like: when you get onto a platform like this, you sign a user agreement or a contract that says, if I promote too much misleading or harmful
Speaker 2 (28:50):
content, then the platform can
Speaker 1 (28:52):
kick me off, because the platform has a standard for the kind of content you can share.
Speaker 2 (28:57):
Of course, that gets into very difficult questions about who's
Speaker 1 (29:00):
going to decide what kind of content is misleading and harmful.
Speaker 2 (29:03):
And that is legitimately tricky, but sort of in
Speaker 1 (29:07):
the extremes, there's a lot of content where, you know, it's just not controversial that it's misleading, that it's interfering with people's abilities to function and make decisions, and that's the kind of stuff that could be a gimme: to say, if people are sharing too much of that, then they don't get platformed on this.
Speaker 2 (29:23):
I think the
Speaker 1 (29:24):
sort of most promising model for how to regulate speech online is to have something analogous to the EPA or the FDA, a regulatory agency, where we're not thinking about, like, passing laws in Congress that say this is what you can do on Instagram, but rather we have
(29:47):
a set of guidelines that social media platforms have to comply with, and then these regulatory agencies can work flexibly with those media companies to comply with those guidelines.
Speaker 3 (29:58):
Okay, that's big stuff, and maybe not up your alley, but there are a ton of ways we can protect ourselves from mis- and disinformation.
Speaker 4 (30:07):
We're unfucking the future. We're unfucking the future.
Speaker 3 (30:19):
If you're like me, you might feel totally equipped to recognize mis- and disinformation, and yet most of us have fallen for it. I definitely have. So what can we actually do here?
Speaker 4 (30:33):
What the fuck can I do?
Speaker 3 (30:37):
I'm joined again by the brilliant Maggie. Maggie, how's it going?
Speaker 2 (30:42):
Hey Chris, I'm doing really well.
Chris, I'm doing really well.
Speaker 8 (30:44):
But honestly, some of the stuff you and Cailin just talked about is pretty fucking depressing. I mean, even those of us who think of ourselves as savvy can be easily duped by misinformation.
Speaker 5 (30:56):
Because the people who put that shit out do a
really good job of it.
Speaker 9 (31:02):
And some of it is disinformation. They really are trying to make us believe something that's not true. I don't know about you, but I think of times when I've been fooled by, like, a visual image, and I believe it because I see it with my own eyes, and then I realized it was manipulated.
Speaker 2 (31:17):
Well, that's the same thing that happens with facts.
Speaker 3 (31:20):
Totally, me too. I mean, Cailin had one tip I wanted to share about how we can steer clear of misinformation. She says one of the easiest ways to spot misinformation related to climate change is to look out for news articles where climate disasters are being blamed on something other than the climate crisis.
Speaker 1 (31:37):
When you see these kinds of social consequences of climate change, like, for example, conflicts related to climate crises or refugees, lines that are casting doubt about the real causes, like, oh, this would have happened even without rising heat, or it wasn't actually the climate that caused this, lines about it
(32:00):
all being a conspiracy. And then I talked about these kinds of distracting claims, where they're talking about harms from climate mitigation or green energy, and those harms might be real, but they're distracting from the much, much more serious harms of continuing to use fossil fuels.
Speaker 5 (32:20):
Oh, I think that is such a good point. Misinformation isn't just incorrect information. It's also information that doesn't include the full picture. Maybe it's facts cherry-picked from a larger study that paint a picture that is very misleading. So if you're wondering if what you're reading is misinformation
(32:44):
or disinformation, here are some red flags to look out for. First of all, I am very skeptical of emotional reactions. Content that uses really emotional or charged language just should be checked. You just want to make sure all the facts are straight before retweeting or sharing it. It's
(33:06):
so tempting, because you're like, oh my gosh, that's shocking, I'm gonna share it. And then, you know, just take a breath, check it out. And when something sounds too good or too bad, or maybe too shocking to be true, well, it just might not be true. Also, always make sure you check the source of the information.
(33:29):
Who funded the study that is being cited? Is it
a reputable academic resource or a corporation that stands to
gain financially? And if it's the latter, maybe take that
information with a grain of salt.
Speaker 3 (33:44):
Oh, that last one is good. Always check out the source of the information and that you're conveying the full picture. And that's what the fuck you can do.
Speaker 4 (33:54):
What the fuck can I do?
Speaker 1 (34:00):
Oh, fuck!
Speaker 3 (34:05):
That's all for this episode. Next time on Unfucking the Future: Bill Nye the Science Guy.
Speaker 2 (34:12):
A question I have for the conservative movement is, why are you doing this? Why are you ignoring the facts? What is it? And you think, well, they're doing it for the money.
Speaker 1 (34:21):
What money?
Speaker 2 (34:23):
We're all gonna die if you keep this up.
Speaker 3 (34:25):
I think you're really going to enjoy it. Until then, I'm Chris Turney, signing off from Sydney, Australia. Thanks for joining me on Unfucking the Future.
Speaker 4 (34:36):
We're Unfucking the Future.
Speaker 3 (34:44):
Unfucking the Future is produced by Imagine Audio and Awfully Nice for iHeart Podcasts and hosted by me, Chris Turney. The show is written by Meredith Brian. Unfucking the Future is produced by Amber von Shassen and Renee Colvert. Ron Howard, Brian Grazer, Kara Welker, and Nathan Chloke are the executive producers from Imagine Audio. Jesse Burton and Katie Hodges
(35:06):
are the executive producers from Awfully Nice. Sound design and mixing by Evan Arnett, original music by Lilly Hayden, and producing services by Peter McGuigan. Sam Swinnerton wrote our theme and all those fun jingles. If you enjoyed this episode, be sure to rate and review Unfucking the Future on Apple Podcasts or wherever you get your podcasts.