Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Listeners know that I am a science nerd, a data nerd.
We talk frequently about data abuse, and of course the
old Mark Twain (and/or Benjamin Disraeli; it's unclear who
said it) thing about there being three kinds of lies: lies,
damned lies, and statistics. And we like to get things
(00:22):
right on this show. Speaking of nerds, I am so
pleased to welcome Roger Pielke Jr. back to the show.
Roger is a professor at the University of Colorado, not old
enough to be professor emeritus, and yet he soon will be,
which makes me think that Roger is his dad.
Speaker 2 (00:41):
Hi, Roger. Hi, I'm good, I'm good, Roger.
Speaker 1 (00:48):
Roger studies and teaches what is essentially the intersection of
science and public policy, which, to me (although I've
been teasing Roger here a little bit this morning), is
among the most interesting and most important things a
person could study. He's got a fantastic Substack
called The Honest Broker. And if you just go type
(01:10):
that in, The Honest Broker Substack or Roger Pielke Substack,
you will find it. Roger posted a piece yesterday entitled
The Top Five Climate Science Scandals, and these things will
either make you laugh or cry or scream, or some combination.
Speaker 2 (01:24):
So I just wanted to spend a little time going
through them with you.
Speaker 1 (01:27):
Roger. All right, sounds great. So why don't we just
go through them in the same order that you
did them in this great piece. Number five: the
interns made a data set and we used it for
our research.
Speaker 3 (01:42):
Yeah, so this has to do with historical hurricane costs.
Speaker 4 (01:46):
Some research out there that I've been involved in seeks to
look at storms from deep in history and say, well,
how much damage would they cause today? Long story short,
a group of scholars in Denmark found an Excel
data set on the internet. It was a marketing data
set for a now-defunct insurance company, and they published
a paper with it, and it said things are getting worse.
(02:09):
I called this to the attention of the journal, one
of the big major journals, the Proceedings of the National
Academy of Sciences, and they refused to look at it.
Speaker 3 (02:16):
So that's one where.
Speaker 4 (02:18):
the self-correcting function of scientific publishing completely failed.
Speaker 1 (02:23):
So, just to be clear, are you saying that the
data in this data set is entirely fictional, is just
made up for some purpose?
Speaker 2 (02:33):
Or is there some other issue with the data.
Speaker 3 (02:37):
The data started out and this is how I know it.
Speaker 4 (02:39):
It started out from one of our papers, and then
over the years the company hired interns and they added
more data to it, and they stapled on another data set
to the end, and it went from a legitimate scientific
data set to something that had no relationship to science.
It was for marketing. And so, you know, buyer beware
when you go out and you find something online.
Speaker 1 (03:02):
And one of the most prestigious scientific journals in the
world allowed a paper in that used the fake data
set and then ignored you when you told them.
Speaker 2 (03:11):
This data set isn't real.
Speaker 4 (03:15):
Yes, yeah. I mean, this paper, since it did have
kind of splashy findings (you know, hurricanes are getting worse),
has been promoted in climate advocacy. So if the journal
were to retract the paper, as they should have, it
would be a domino that caused a whole bunch of
other dominoes to fall.
Speaker 3 (03:32):
So, and it's not unique that journals.
Speaker 4 (03:35):
Are reluctant to retract papers because they look bad for
publishing bad science in the first place. But the only
way science works is if we recognize when things go
wrong and correct them and move forward.
Speaker 1 (03:46):
Before we get to the next scandal, please give me
a one- or two-sentence, high-level summary, for the
layman, of the intersection between quote-unquote climate change and hurricanes.
Speaker 3 (04:05):
So, climate change is real, climate change is serious.
Speaker 4 (04:08):
However, the Intergovernmental Panel on Climate Change has not
detected changes in hurricane behavior, and that's in the Atlantic
and globally, over the past century. And that's just where
the science is right now.
Speaker 2 (04:21):
Okay, and just a quick follow-up on that.
Speaker 1 (04:23):
So then would it be correct to say that when
you hear these reports of massively more damage that really
what they're measuring is that more people have built homes
in the path of hurricanes, and homes are more expensive
than they used to be.
Speaker 4 (04:41):
Absolutely. All of the increasing damage from hurricanes is due to
more people, more property, more wealth in harm's way.
Speaker 1 (04:49):
Number four, the Alimonti retraction for an unpopular view.
The Alimonti retraction sounds like the title of a Robert
Ludlum novel.
Speaker 2 (04:57):
What is it?
Speaker 3 (05:00):
So?
Speaker 4 (05:00):
This is a paper that was published by a group
of Italian scientists basically summarizing what the Intergovernmental Panel on
Climate Change, the IPCC, says about extreme events. Most people
are surprised to learn that, with a few exceptions (extreme heat
waves and extreme precipitation are the exceptions), floods, droughts,
(05:21):
and hurricanes have not been associated with human-caused climate change.
So they wrote a paper summarizing it. The Guardian newspaper
in the UK contacted a few scientists who said some
mean things about it, and they demanded that this paper
be retracted. Retraction happens when there's bad science. There was
(05:41):
no bad science in this paper. It was just inconvenient
and the journal, again another top journal, retracted the paper
and took it out of the literature, even though there
has been no claim that there's anything wrong with it.
Speaker 3 (05:53):
It just had the wrong message.
Speaker 1 (05:55):
Okay, So the first thing we talked about was a
journal not retracting a paper that had false data in it,
and the second one is a journal retracting a paper
that didn't have anything false in it because it annoyed somebody.
Speaker 3 (06:10):
That's exactly right.
Speaker 2 (06:12):
Oh my gosh, all right.
Speaker 1 (06:13):
Number three: a major error in the IPCC. And then
you have a little quote from IPCC AR6, Chapter 11.
Speaker 2 (06:22):
What are we talking about here?
Speaker 4 (06:25):
So, the IPCC is an assessment body.
They summarize the science, and if they didn't exist, we'd
have to invent them.
Speaker 3 (06:32):
They do a really important job.
Speaker 4 (06:34):
In the first draft of their latest report, they said
it is not expected that a trend in hurricane intensity
should be detectable over the past forty years or so.
That's what they're saying: we don't expect that we should
see any climate change signal there. By the
time it went from the first draft to what's called
the synthesis report, it was like a game of telephone
(06:55):
or Chinese whispers. It got transformed into the opposite of
what it said, and it involved just a silly mistake,
confusing hurricane measurements with hurricanes themselves. Mistakes happen, they creep
in, and, you know, what I say is,
making a mistake is not a crime.
Speaker 3 (07:14):
It's what you do when you identify the mistake.
Speaker 4 (07:16):
This mistake is sitting out in plain view, but it
gets repeated all the time, particularly in the media, related
to hurricanes.
Speaker 1 (07:23):
I think you and I talked about this once before.
So I just want to make sure folks understand what
we're talking about here. So instead of going out to
measure hurricane intensity, let's say, fifty times (I'm just
making up a number) over the course
of a hurricane season, people went out to measure hurricane
intensities one hundred times over the hurricane season, and somehow
(07:48):
the fact that they went to measure more often got
translated into a claim that there are either more hurricanes
or stronger hurricanes or both, which was actually not
the intention of the people who were going to measure more.
Speaker 2 (08:05):
They were just going to measure more. Is that about right?
Speaker 3 (08:09):
Yeah, that's basically it. That's basically it.
Speaker 4 (08:11):
It's a terminology issue: measurements got mistaken as
hurricanes, and over successive versions of the report,
it just got pushed on, and then it became a
top-line finding.
Speaker 1 (08:24):
Okay, so let me follow up on this. Why is
that important?
Speaker 4 (08:30):
Well, so everyone likes to look at extreme events as
an indicator of climate change.
Speaker 3 (08:35):
It's, it's advocacy.
Speaker 4 (08:37):
Every one of us knows, you know, the thing that
just happened: the hailstorm, the flood; the hurricanes
are the most popular. People like to associate them with
human-caused climate change. However, the full IPCC report doesn't
support most of those connections. So this mistake is actually
contrary to the science in the report, and it should
(08:59):
be fixed, because, you know, people go to the top lines;
they don't go into the ten-thousand-page report to
see what the science really says. So it's enormously
frustrating, because it's the opposite of what the scientific literature says.
Speaker 1 (09:12):
All right. So the next climate scandal relates very closely
to what you were just saying. And number two: billion-dollar
disasters as the best indicator of climate change.
Speaker 2 (09:22):
Why is this a scandal? What's the scandal?
Speaker 3 (09:25):
So now we're getting into the big time.
Speaker 4 (09:28):
So most people have probably heard of the tabulation of
billion-dollar disasters, how many there are every year, by the
National Oceanic and Atmospheric Administration, NOAA.
Speaker 3 (09:38):
It's a great agency.
Speaker 4 (09:40):
But about twenty years ago or so, they started promoting
counts of billion-dollar disasters, like a listicle: very media-friendly,
very simple to understand. And unfortunately, the methods behind
constructing this are hidden in secret.
Speaker 3 (09:58):
I located a secret directory.
Speaker 4 (10:01):
On the NOAA website, unmarked, unlabeled, thanks to a whistleblower,
and found, you know, all sorts of shenanigans going on
in how
Speaker 3 (10:08):
This data set is put together.
Speaker 4 (10:11):
This data set has been quoted by the president, by
President Biden. You can find it mentioned in Congress.
It's gone from something that might have
been clever marketing for the agency to something that people
think is scientific, but there's not much science behind it.
Speaker 1 (10:28):
It's a remarkable thing to have, especially a government agency,
you know, tricking people. And do you want to offer
an opinion as to why they're doing this? I mean, you've
written about this, and I read everything you write. So
not only have you written about it, but
you've made sure
(10:48):
they know what you're writing. And despite that... Yeah, that's right.
So, go ahead, go ahead.
Speaker 4 (10:55):
Yeah, I filed a scientific integrity complaint with NOAA. We're
actually supposed to hear back in the next two
days about this data set.
Speaker 3 (11:03):
You know.
Speaker 4 (11:03):
I think it's one of those things. Everyone knows NASA:
they have pictures of Mars and spaceships and their own
TV channel. They're really good at marketing. NOAA, they're the
National Weather Service; they're not as good as NASA. So
I think once the president is talking about your agency
and a quote-unquote data set that you have on
your website, it becomes very tricky to disentangle yourself from that,
(11:25):
even if there are flaws in it. I don't think
this is, you know, climate politics. I don't think it's
left-right politics. I think it's more agency politics and
the seduction of having a bazillion clicks on a website.
Speaker 1 (11:39):
All right. And number one on your list
of climate scandals is a thing that you've been writing
about a lot, and you entitle this version of it
a love affair
Speaker 2 (11:51):
With extreme emissions scenarios.
Speaker 4 (11:53):
Please explain. Yeah, and this one, this is head and
shoulders above all the rest. This is a
huge problem for the climate science community. Climate scientists project
the long-term future of the climate. They have to
start with, well, how much carbon dioxide are we going
to emit? And that requires knowing, well, what energy sources
(12:14):
are we going to have? Long story short, you know,
about three decades ago there was an assumption made that
all global energy
Speaker 3 (12:22):
was going to come from coal.
Speaker 4 (12:24):
It was going to increase by ten times by the
end of the century, so we would have massive, massive emissions.
When you plug massive emissions into climate models, you get
massive climate change.
Speaker 3 (12:35):
It turns out the world is actually
Speaker 4 (12:38):
not going to increase its coal consumption by a factor
of ten. In fact, the International Energy Agency projects peak
coal, even though China and India are still burning it.
Speaker 3 (12:46):
Peak coal within a decade.
Speaker 4 (12:49):
So there will be climate change in the future, but it's
not nearly as severe as the extreme emissions scenarios would
have you think. The climate community knows this, and has
known about it for a long time, thanks to one
of my colleagues named Justin Ritchie. But for whatever reason,
it has had a very difficult time reorienting towards the
(13:11):
more moderate, less extreme scenarios.
Speaker 3 (13:14):
And that's a big problem.
Speaker 1 (13:16):
Is this because there are no grants from the MacArthur Foundation
or the federal government if you say things are basically
going to be all right?
Speaker 4 (13:27):
You know, I think in our paper on this, we
see it as a multi-pronged causality. Part of it is,
you know, you get in the New York Times, in
the Wall Street Journal, if you have really dramatic, scary papers.
Speaker 3 (13:40):
University press offices like.
Speaker 4 (13:42):
To put out press releases that show impactful, significant findings.
Advocates love extreme scenarios because they make the future look
look more severe than it probably is, and there's also
legitimate scientific reasons to use extreme scenarios to test models.
It may not say anything about the real world, but
(14:02):
it may tell you something Scientifically. All of those factors
mix up together, and it doesn't matter if it's climate
change or another topic.
Speaker 3 (14:11):
People love the extreme.
Speaker 4 (14:13):
Extreme stories, extreme scenarios, and climate change is no different.
It's just that they've had an incredibly difficult time letting
go of this and moving on in the direction where
we know the science is more accurate.
Speaker 2 (14:25):
Yeah.
Speaker 1 (14:25):
I'm reading an economics book, and I'm going to talk
with the author at some point soon. And one of
the things he warns people against is taking the
black swan events, the extremely unlikely black swan events, and
making them the center of your probability distribution, right? And
that's kind of what's happening here: folks are
now taking the extreme climate change scenarios, so extreme as
(14:50):
to be essentially impossible, and using them as the basis
for papers where they talk about them as if they're
what's reasonably likely to happen.
Speaker 2 (14:58):
Okay, I want to switch gears with you. I have
two minutes, Roger.
Speaker 1 (15:01):
You wrote a piece a couple of days ago, the
Olympic War over Women, and this is another area
of study for you, sports and sports policy and so on.
And I think I was one of the few
people to kind of, sort of, get it right on
these two Olympic female boxers.
Speaker 2 (15:18):
And what I said was, and I
Speaker 1 (15:21):
Just want you to address this quickly and tell me
whether you think I'm mostly right, mostly wrong whatever. I
said that First, this is not the kind of transgender
situation that we've been talking about with you know, biological
men playing in women's sports.
Speaker 2 (15:33):
That is not what this is. These are, these are apparently
Speaker 1 (15:38):
uh, physically women, and biologically, to some significant degree, women,
but also apparently with some sex characteristics of men: XY
chromosomes, and likely producing much higher levels of testosterone
than, you know, typical women do. I want to be
careful how I talk about this; I'm not trying to insult anybody.
Speaker 2 (15:59):
It's just a different situation biologically.
Speaker 1 (16:01):
In my opinion, based on what I know, I think
they have what I would call an unfair physical advantage.
Maybe not quite to the degree that a full trans
athlete would, but there is something there.
Speaker 2 (16:15):
How do you think we should think about this?
Speaker 4 (16:19):
Yeah, so, you know, I largely agree there. These
women were women, girls from birth, continuously, all the way
to adulthood.
Speaker 3 (16:29):
So that's not really in question.
Speaker 4 (16:30):
We don't, in fact know their biology or medical history,
and you know, ethically, nor should we.
Speaker 3 (16:36):
It's none of our business.
Speaker 4 (16:37):
I wouldn't want my medical history out in public, and
neither would you, neither would anybody else. The issue here
isn't who they are; you put your finger on it.
The issue here is: do they have an unfair performance
advantage against other women?
Speaker 3 (16:51):
And the good news is.
Speaker 4 (16:52):
We can use scientific tests and research to identify what
an unfair advantage is. We do that with athletes who
run on cheetah blades, who are amputees, and we study
them, and sometimes they're allowed to participate in
mainline sport, and sometimes they're not. And what determines that
is whether they have an unfair advantage. So it's not
(17:13):
enough to say, well, here's your biology, you're a woman,
but you're a superwoman.
Speaker 3 (17:19):
You have to have evidence to make this case.
Speaker 4 (17:22):
And unfortunately this gets all wrapped up in the politics
of gender, which is, you know, really nasty. Yeah,
but there is a simple, straightforward way to deal with this,
and that's
Speaker 3 (17:30):
That's with evidence and science.
Speaker 1 (17:31):
So just real quick, do you think
that the evidence and science that you're describing could be
as simple as measuring testosterone levels, or is it much
more complex than that?
Speaker 3 (17:44):
It's much more complex than that.
Speaker 4 (17:46):
We actually did a study and I testified before the
Court of Arbitration for Sport.
Speaker 3 (17:50):
On this topic.
Speaker 4 (17:52):
You can't go to the Olympics and take people's testosterone
levels and figure out who's going to be on the
medal stand.
Speaker 3 (17:59):
It doesn't work like that.
Speaker 4 (18:01):
In general, typical men have higher testosterone than typical women.
But when you get to, you know, variations of biology
and complex, what are called differences of sex development,
there's a lot more overlap
Speaker 3 (18:14):
than people would assume.
Speaker 4 (18:15):
So it is super complicated biologically, which is
why I tell people: focus on performance, because performance is objective.
We can measure it, and we don't have to get into,
you know, trying to define people, talk about how they
look, and things like that.
Speaker 1 (18:29):
I think listeners can understand why I said, when we began
this interview, that Roger Pielke's area of study is one
of the most interesting areas of study you can imagine:
the intersection of science and policy. Roger Pielke is a
professor at the University of Colorado, for a little while longer.
His Substack is called The Honest Broker. I encourage everybody
to go subscribe. It doesn't cost you anything, and it's
(18:51):
worth a lot more than that. Roger, thanks so much
for your time. As always, tremendous conversation.
Speaker 3 (18:57):
Thank you, Ross.
Speaker 2 (18:58):
All right, you too. Talk with you again.