
June 20, 2020 43 mins

People often ask us how we do our research. We're not going to disclose all of our secrets, but we'll give you some tips on how to root out the bad studies from the good ones. Learn all about shady studies and reporting in this classic episode!



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey everybody, it's me, Josh, and for this week's SYSK Selects, I've chosen our guide to
research tips. It's a surprisingly good episode that shares the
ins and outs of keeping from being duped online by
bad information and how to read between the lines on
sensational science reporting, all sorts of stuff like that. And

(00:22):
you might notice in this episode Chuck sounds different than usual.
That's because this is during the period that he was
transitioning into a person with a full set of teeth,
so that adds to the hilarity of the whole thing.
I hope you enjoy this as much as we did making it. Welcome to Stuff You Should Know, a production

(00:42):
of iHeartRadio's HowStuffWorks. Hey, and welcome to the podcast. I'm Josh Clark, with Charles W. Chuck Bryant and Jerry. This is Stuff You Should Know. Um, Josh,
we're gonna do something weird today and we're gonna do

(01:03):
a listener mail at the head of the podcast, right,
all right, what, all right, let's do it. Okay, this
is from Bianca. Is the listener mail music going? Oh, I don't know, should we go the whole nine yards? People might freak out. All right, this is from Bianca. Uh

(01:23):
A voice is what I'm gonna say. I think that's great. Hey,
guys, I wrote you not too long ago asking how you research your own podcast. I just got back from a class where we talked about research misrepresentation in journal articles.
Apparently journals don't publish everything that is submitted. A lot
of researchers don't even publish their studies if they don't like
the results. Some laws have been put into place to

(01:44):
prevent misrepresentation, such as researchers having to register their studies
before they get results and journals only accepting preregistered studies.
But apparently this is not happening at all, even though
it is now technically law. This ends with the general
public being misinformed about methods and drugs that work.
For example, if there are twenty five studies on whether a drug

(02:04):
works, it's more likely that twenty of the
positive results have been published and only one or two
of the negative. Uh. And that is from Bianca, and
that led us to this article on our own website.
Ten signs that that study is bogus and here it
is nice Chuck. Well, we get asked a lot about

(02:26):
research from people, usually in college, who are like, you guys
are professional researchers. How do I know I'm doing a
good job and getting good info? And it's getting harder
and harder these days, it really is. You know. One
sign that I've learned is if you are searching about
a study and all of the hits that come back

(02:48):
are from different news organizations and they're all within like
a two- or three-day period from a year ago, nothing like, nothing more recent than that, then somebody released a sensational study and no one put any actual effort into investigating it,
and there was no follow up. If you dig deep enough,
somebody might have done follow up or something like that,

(03:08):
but for the most part, it was just something that
splashed across the headlines, which more often than not is
the case as far as science reporting goes. So that's a bonus, that's the eleventh one. Boom. How about that? Yeah,
should we just start banging these out, let's do it?
Or do you have something else clever? Well, part and

(03:30):
parcel with that. I don't know if it's clever. You
do come across people who you know can be trusted
and relied upon to do good science reporting. So like
Ed Yong is one. Another guy named Ben Goldacre has something called Bad Science. I don't remember what outlet he's with. And then there's a guy, I think with Scientific American, named John Horgan, who's awesome. Yeah. Or some journalism

(03:53):
organizations that have been around and stood the test of
time that you know are really doing it right, like
a Nature, yeah, Scientific American, those guys are like real science. Yeah,
Like I feel I feel really good about using those sources. Yeah,
but even they can you know, there's there's something called
scientism, um, where there's a lot of like faith and
dogma associated with the scientific process, and you know you

(04:15):
have to root through that as well. Try it. I'm done. Uh.
The first one that they have here on the list
is that it's unrepeatable, and that's a big one. UM.
The Center for Open Science did a study, uh was
a project really where they took two hundred and seventy
researchers and they said, you know what, take these one
hundred studies that have already been published, psychological studies, and

(04:39):
just pore over them. And uh, just last year, it took them a while, took them several years, they said,
you know what, more than half of these can't even
be repeated using the same methods. They're not reproducible. Nope,
not reproducible. That's a big one. And that means that when they carried it out, they followed the methodology. Um, scientific method podcast, you should listen to

(05:01):
that one. That was a good one. They found that their results were just not what the people published, not anywhere near them. Um. For example,
they used one as an example where a study found
that men were terrible at determining whether a woman was giving them, um, some sort of like clues

(05:23):
to attraction or just being friendly. Sexy stuff, or friends, or yeah, good to meet you, or buzz off, jerk. Yeah. Um. And they did the study again
and as part of this Center for Open Science study, or survey, and they found that that was
not reproducible or that they came up with totally different results.

(05:45):
And that was just one of many. Yeah. And in
this case specifically, they looked into that study and they
found that, um, one was in the United Kingdom, one was in the United States. That may have had something to
do with it. But the point is, Chuck, is if
you're talking about humanity, I don't think that the study
was like the American male is terrible at it. It's

(06:06):
men are terrible at it. Right. So that means that
whether it's in the UK, which is basically the US
with an accent and a penchant for tea, I'm just kidding, UK. So, um, it should be universal. Yeah,
you know, I agreed, unless you're saying no, it's just
this only applies to American men, right, or the American

(06:28):
men, right, then it's not even a study. Yeah. Uh. The
next one we have is, uh, it's it's plausible, not
necessarily provable. And this is a big one because and
I think, um, we're talking about observational studies here more
than lab experiments, because with observational studies, you know, you

(06:49):
sit in a room and get asked three questions about something,
and all these people get asked the same questions, and
then they pore over the data and they draw out
their own observations. And very famously, one observational study
that led to false results found a correlation between having
a type A personality and um, being prone to risk

(07:10):
for heart attacks and um for a long time, you
know that the news outlets were like, oh, yes, of
course that makes total sense. This study proved what we've
all known all along, um, And then it came out
that no, actually what was going on was a well
known anomaly where you have a five percent um risk

(07:33):
that chance will produce something that looks like a statistically
significant correlation when really it's just total chance. And science
is aware of this, especially with observational studies, because the
more questions you have, the more opportunity you have for
that five percent chance to create a seemingly statistically significant

(07:54):
correlation when really it's not there. It was just random chance,
where if somebody else goes back and does the same
same study, they're not going to come up with the
same results. But if a researcher is, I would guess,
willfully blind to that five percent chance, um, they will

(08:16):
go ahead and produce the study and be like, no,
it's true, here's the results right here, go ahead and
report on it and make my career. Yeah. Well, and
they also might be looking for something, in effect. Chances are they are. Um, it's not just some random study, like, let's just see what we get if we ask a bunch of weird questions. It's like, hey, we're looking to try and prove something, most likely, so that Baader-Meinhof

(08:36):
thing might come into play where you're kind of cherry
picking data. Yeah, that's a big problem that kind of
comes up. A lot of these are really kind of
interrelated. Totally. The other big thing that's interrelated
is how the media reports on science these days. Yeah,
you know, it's a big deal. Yeah. Like John Oliver
just recently went off on this and NPR did a
thing on it, like they might even like the researcher

(09:00):
might say plausible, but it doesn't get portrayed that way
in the media. Sure. Remember that poor kid who thought
he found the ancient Mayan city. The media just took
it and ran with it. You know, Yeah, I think
there was a lot of maybe or it's possible, we
need to go check kind of thing. The media is like, no,
he discovered an ancient Mayan city never known before. Yeah,

(09:21):
and let's put it in a headline. And that's I mean,
that's the... that's just kind of the way it is these days. You have to be able to sort through it. And I guess that's what we're doing here, aren't we, Chuck?
We're telling everybody how to sort through it, or at
the very least take scientific reporting with a grain of salt, yes, right,
and not well, like you don't necessarily have the time
to go through and double-check that research and then check

(09:44):
on that research, and you know, right, so take it
with a grain of salt. Um, unsound samples. Uh. There was this study that basically said, um, how you lost your virginity is going to have a very large impact and play a role in how you feel about

(10:04):
sex and experience sex for the rest of your life. Yeah,
it's possible. Sure, it seems logical, so we'll just go
with it. But when you, um, only interview college students
and uh you don't you only interview heterosexual people, then

(10:25):
you can't really say you've done a robust study, now,
can you? Plus you also take out of the sample size, your sample population, anybody who reports having had a violent encounter. Throw them out, throw that data out, because that's not gonna inform how you feel about sex, right? Exactly.
You're just narrowing it down further and further and again
cherry picking the data by throwing people out of your

(10:48):
population sample, that will throw off the data that you want. Yeah, and I'd never heard of this acronym, WEIRD. Um, a lot of these studies are conducted by professors in academia, so a lot of times you've got college students as your sample, and there's something called WEIRD: Western, educated, from industrialized, rich and democratic countries. Right,

(11:10):
those are the participants in the studies, the study subjects. But
then they will say, men, right, well, what about the
gay man in Africa? Right, like you didn't ask him,
so that was that's actually a really really big deal. Um.
In two thousand and ten, these three researchers did
a survey of a ton of social science and behavioral

(11:34):
science studies and found that eighty percent of them used WEIRD
study participants. So basically it was college kids for eighty
percent of these papers. And they surveyed a bunch of
papers and they took it a little further and they
said that, um, people who fit into the weird category

(11:55):
only make up twelve percent of the world population, but
they represent eighty percent of the population of these studies.
And a college student chuck in North America, Europe, Israel,
or Australia is four thousand times more likely to be
in a scientific study than anyone else on the planet.

(12:15):
And psychology and behavioral sciences are basing their findings about everybody else on this small tranche
of humanity, and that's a that's a big problem. That's
extremely misleading. Yeah, and it's also a little insulting because
what they're essentially saying is like, this is who matters.

(12:36):
Well also, yeah, but what's sad is this is who
I am going to go to the trouble of recruiting
for my study. It's just sheer laziness. And I'm sure
a lot of them are like, well, I don't have
the funding to do that. I guess I see that,
But at the same time, I guarantee there's a tremendous
amount of laziness involved. Yeah, or maybe if you don't

(12:58):
have the money, maybe don't do that study. Is it
that simple? I'm probably over simplifying. I don't know. I'm
sure we're going to hear from some people in academia
about this one. Well, stop using WEIRD participants, or at
the very least say, um, like, this is the sexual behavior of Dartmouth students. Yeah,

(13:19):
this applies to them, not everybody in the world. Of these studies where they use those people as study participants, they're not even, they're not even emblematic of the
rest of the human race. Like college students are shown
to see the world differently than other people around the world.

(13:40):
So it's not like you can be like, well, it still works, you can still extrapolate. It's like flawed in every way, shape and form. We should probably take a break. Yeah,
let's take a break because you can get a little
hot under the collar. I love it. Man. Uh, we'll
be right back after this. Just god so much. Sorry,
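To make that five percent point from earlier in the segment concrete, here is a minimal sketch in Python. It is not from the episode; the sample size of one hundred people, the 0.05 threshold, and the simple two-proportion z-test are assumptions chosen only for illustration. It simulates an observational survey where the outcome and every answer are pure coin flips, so any "significant" correlation it finds is chance.

import math
import random

def two_prop_p_value(group_a, group_b):
    # Two-proportion z-test, normal approximation, standard library only.
    p1 = sum(group_a) / len(group_a)
    p2 = sum(group_b) / len(group_b)
    pooled = (sum(group_a) + sum(group_b)) / (len(group_a) + len(group_b))
    se = math.sqrt(pooled * (1 - pooled) * (1 / len(group_a) + 1 / len(group_b)))
    if se == 0:
        return 1.0
    z = abs(p1 - p2) / se
    return math.erfc(z / math.sqrt(2))  # two-sided p-value

def chance_of_spurious_hit(n_people, n_questions, trials=1000, alpha=0.05):
    # Fraction of simulated surveys that report at least one "significant"
    # correlation even though every variable is random noise.
    hits = 0
    for _ in range(trials):
        outcome = [random.random() < 0.5 for _ in range(n_people)]
        for _ in range(n_questions):
            answer = [random.random() < 0.5 for _ in range(n_people)]
            yes = [o for o, a in zip(outcome, answer) if a]
            no = [o for o, a in zip(outcome, answer) if not a]
            if len(yes) > 1 and len(no) > 1 and two_prop_p_value(yes, no) < alpha:
                hits += 1
                break
    return hits / trials

print(chance_of_spurious_hit(100, 1))   # roughly 0.05
print(chance_of_spurious_hit(100, 30))  # roughly 0.75 or higher

That is the point being made above: with enough questions and no correction for multiple comparisons, a chance result that looks statistically significant is close to guaranteed.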

(14:19):
all right, what's next, buddy? Uh? Very small sample sizes. Right,
if you do a study with twenty mice, then you're
not doing a good enough study. No, So they used
this um in the in the article, they use the

(14:41):
idea of ten thousand smokers and ten thousand non smokers,
and they said, okay, if you have a population sample
that size, that's not bad. It's a pretty good start.
And you find that a good number of the smokers developed lung cancer, but only five of the non-smokers did, then your study has what's called high power. Yeah. Um,

(15:02):
if you had something like ten smokers and ten non smokers,
and two of the smokers developed lung cancer and one non-smoker developed lung cancer as well, you have very little power, and
you should have very little confidence in your findings. But regardless,
it's still going to get reported if it's a sexy idea.
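The same kind of sketch, again not from the episode, can show what is meant here by power. The effect sizes are assumptions (thirty percent of smokers versus five percent of non-smokers developing the disease, since the exact figures are unclear in the audio), and the test is the same crude two-proportion z-test: with ten per group the real difference is usually missed, with ten thousand per group it is found essentially every time.

import math
import random

def two_prop_p_value(group_a, group_b):
    # Two-proportion z-test, normal approximation, standard library only.
    p1 = sum(group_a) / len(group_a)
    p2 = sum(group_b) / len(group_b)
    pooled = (sum(group_a) + sum(group_b)) / (len(group_a) + len(group_b))
    se = math.sqrt(pooled * (1 - pooled) * (1 / len(group_a) + 1 / len(group_b)))
    if se == 0:
        return 1.0
    z = abs(p1 - p2) / se
    return math.erfc(z / math.sqrt(2))  # two-sided p-value

def estimated_power(n_per_group, p_smoker=0.30, p_nonsmoker=0.05,
                    trials=500, alpha=0.05):
    # Fraction of simulated studies that detect the (real) difference
    # between the two assumed disease rates.
    detected = 0
    for _ in range(trials):
        smokers = [random.random() < p_smoker for _ in range(n_per_group)]
        nonsmokers = [random.random() < p_nonsmoker for _ in range(n_per_group)]
        if two_prop_p_value(smokers, nonsmokers) < alpha:
            detected += 1
    return detected / trials

print(estimated_power(10))      # low power: the real effect is often missed
print(estimated_power(10_000))  # high power: the effect is found almost every time

Low power cuts both ways: a tiny study that does report an effect is also far more likely to be overstating it or describing noise.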

(15:23):
Yeah, for sure. Um. And because these are kind of overlapping
in a lot of ways, I want to mention this guy, a scientist named Ulrich, uh, Dirnagl. Uh, he and his colleague Malcolm Macleod have been trying, I mean,
and there are a lot of scientists that are trying
to clean this up because they know it's a problem.
But he co wrote an article in Nature. Uh that's

(15:45):
called Robust Research: Institutions Must Do Their Part for Reproducibility.
So this kind of ties back into the reproducing things,
like we said earlier, and his whole idea is, you know what, they should tie funding to good
institutional practices, like you shouldn't get the money if you
can't show that you're doing it right. Um. And he

(16:08):
said that would just weed out a lot of stuff.
Here's one staggering stat for reproducibility and small sample size.
Biomedical researchers for drug companies reported that only a fraction of the papers that they publish are even reproducible, and
that was like an insider stat and it doesn't matter.

(16:29):
The drugs are still going to market. Yeah, which
is that's a really good example of why this does
matter to the average person. You know, like if you
hear something like um uh, monkeys like to cuddle with
one another because they are reminded of their mothers. Study shows, right,

(16:50):
you can just be like, oh, that's great, I'm
going to share that on the internet. Doesn't really affect
you in any way. But when there are studies being conducted that are, that are creating drugs that could kill you or not treat you, or that kind of thing, and it's attracting money and funding and that kind of stuff,

(17:10):
that's like that's harmful. Yeah. Absolutely. I found another survey
did you like that terrible study idea that I came up with, like the monkeys like to cuddle? A hundred and
forty trainees at the MD Anderson Cancer Center in Houston, Texas,

(17:31):
Thank you Houston for being so kind to us at
a recent show. They found that nearly a third of
these um trainees felt pressure to support their mentor's work,
like to get ahead or not get fired. So that's
another issue. As you've got these trainees or residents, uh,
and you have these mentors, and even if you disagree

(17:52):
or don't think it's a great study, you're you're pressured
into just going along with it. I could see that
for sure. There's, there seems to be a huge hierarchy in, um, science. Yeah, in a lab. You know, you've got the person who runs the lab. It's their lab, and you don't go against them. But there are people, um... like Science and Nature, two great journals, are updating their guidelines right now.

(18:13):
They're introducing checklists. Um, Science hired statisticians to their panel of reviewing editors, not just other, you know, peer reviewers, like actual, actual hard-numbers people specifically, because that's a huge part of the process of studies. It's like this, this mind-breaking statistical analysis. It can be used

(18:36):
for good or ill. And I mean, I don't think
the average scientist necessarily is a whiz at that, although
it has to be part of training, but not necessarily.
And that's a different kind of beast altogether. UM stats,
we talked about it earlier. I took a stats class
in college. So much trouble, I was awful at it.

(18:56):
It really just, it's a special kind of math, even. Yeah,
I didn't get it. I passed it, though I passed
it because my professor took pity on me. Um, that Ulrich Durna... Dirnagl. Ulrich Dirnagl. Um. He is a, he's

(19:20):
a big-time crusader; this is his jam, making sure that science is good science. One of the things, um, he crusades against is, remember that virginity study
where they just threw out anybody who had a violent
encounter for their first sexual experience. UM. Apparently that's a
big deal with animal studies as well. If you're studying

(19:41):
the effects of a drug or something, like there was this one in the article. Um. If you're studying the
effects of a stroke drug and you've got a control
group of mice that are taking the drug or that
aren't taking the drug, and then a test group that
are getting the drug. Um, and then like three mice
from the group die even though they're on the stroke drug.

(20:02):
They die of a massive stroke, and you just literally
and figuratively throw them out of the study, um, and
don't include them in the results. That changes the data.
And he's been a peer reviewer on a paper before, he's like, no, this doesn't pass peer review.
You can't just throw out what happened to these three rodents?
You started with ten, there's only seven reported in the end.

(20:24):
What happened to those three? And how many of them
just don't report the ten? They're like, oh, we only
started with seven. Who's gonna know, you know? Well, I was
about to say I get the urge. I don't get
it because it's not right. But I think what happens
is you work so hard at something. Yeah, yeah, and
you're like, how can I just walk away from two
years of this because it didn't get a result? Okay?

(20:46):
The point of real science, though, you have to walk
away from it, Well, you have to publish that. And
that's the other thing too, And I guarantee scientists will say, hey, man,
try getting a negative paper published in a good journal.
These days, they don't want that kind of stuff. But
part of it also is I don't think it's enough
to just be published in like a journal.

(21:08):
You want to make the news cycle as well. That
makes it even better, right, Um, So, I think there's
a lot of factors involved, but ultimately, if you take
all that stuff away, if you take the culture away,
from it, if you get negative results, you're supposed
to publish that so that some other scientists can come
along and be like, oh, somebody else already did this
using these methods that I was going to use. I'm

(21:29):
not gonna waste two years of my career because somebody
else already did. Thank you, buddy for saving me this
time and trouble and effort to know that this does
not work. You've proven this doesn't work. When you start
to prove it does work, you actually proved it didn't work.
That's part of science. Yeah, And I wish there wasn't
a negative connotation to a negative result because to me,

(21:53):
it's, the value is the same for proving something does work as proving something doesn't work. Right. Again, it's just not as sexy. Yeah, but I'm not sexy either, so
maybe that's why I get it. Uh. Here's one that
I didn't know was a thing. Predatory publishing. You never
heard of this? So here's the scenario. You're a doctor
or a scientist and um, you get an email from

(22:17):
a journal that says, hey, you got anything interesting for us.
I've heard about your work, and you say, well, I
actually do I have this this study right here. They say, cool,
we'll publish it. You go, great, my career is taking off.
Then you get a bill that says, where's my three
grand for publishing your article? And you're like, I don't
owe you three grand, all right, give us two. And

(22:39):
you're like, I can't even give you two. And if
you fight them long enough, maybe they'll drop it and
never work with you again, or maybe it'll just be like, well,
we'll talk to you next quarter. Exactly. That's called predatory publishing,
and it's a I'm not sure how new it is,
maybe it is, is it pretty new? But it's a thing
now where uh, you can pay essentially to get something published. Yes,

(23:07):
you can. Um it kind of it's kind of like
who's who in behavioral sciences kind of thing, you know. Um.
And apparently it's new because it's a result of open
source academic journals, which a lot of people push for,
including Aaron Swartz very famously, who like took a bunch of academic articles and published them online and was prosecuted

(23:28):
heavily for it. Persecuted, you could even say. Um, but the idea that science is behind this paywall, which is another great article from Priceonomics by the way, um, it
really just ticks a lot of people off. So they
started open source journals right and as a result, predatory
publishers came about and said, Okay, yeah, let's make this free,

(23:49):
but we need to make our money anyway, so we're
going to charge the academic who wrote the study for
publishing it. Well yeah, and and sometimes now it's just
a flat out scam operation. There's this guy named Jeffrey
Beall, who is a research librarian. He is my new
hero because he's truly like one of these dudes that

(24:10):
has uh he's trying to make a difference and he's
not profiting from this, but he's spending a lot of
time creating a list of predatory publishers. Yeah,
a significant list too. Yeah, how many four thousand of
them right now? Um. Some of these companies flat out

(24:32):
lie like they're literally based out of Pakistan or Nigeria
and they say, no, we're a New York publisher, so
it's just a flat out scam. Or they lie about
their review practices, um, like they might not have any
review practices and they straight up lie and say they do.
There was one called Scientific Journals International out of Minnesota

(24:53):
that he found out was just one guy, like literally
working out of his home, just begging for articles,
charging to get them published, not reviewing anything, and just
saying I'm a journal. Yeah, I'm a scientific journal. He
shut it down apparently or tried to sell it. I
think he was found out. UM and this other one,

(25:16):
the International Journal of Engineering Research and Applications. They created
an award and then gave it to themselves, and even modeled the award on an Australian TV award, like the physical statuette. That's fascinating. Doesn't that make you want to do that? We're gonna give ourselves, yeah, the best podcast in

(25:36):
the universe award, and it's gonna look like the Oscar. Yeah, okay, the Oscar crossed with the Emmy. Uh. This other one,
Med No Publications actually confused the meaning of STM, Science, Technology, Medicine. They thought it meant Sports Technology and Medicine. Well,
a lot of, um, science journalists, or scientists too, but

(26:00):
watchdogs like to send like gibberish articles into those things to see if they publish them, and sometimes they do. Frequently they do. They sniffed them off the case, it's
the big time. How about that callback. It's been
a while. It needs to be a T shirt. Did
we take a break? Yeah, all right, we'll be back
and finish up right after this. Just like the number

(26:21):
of stars the sky so much, Sorry, George. So here's

(26:44):
a big one. You ever heard the term follow the
money? Mm. That's applicable to a lot of realms of society and most certainly in journals. Um, if something looks hinky, just do a little investigating and see who's sponsoring their work. Well,
especially if that person is like, no, everyone else is wrong, right,

(27:07):
climate change is not man made kind of thing. Sure.
You know, if you look at where their funding
is coming from, you might be unsurprised to find that
it's coming from people who would benefit from the idea
that anthropogenic climate change isn't real. Yeah, well we might
as well talk about him Willie Soon. Yeah, Mr Soon?
Is he a doctor? He's a He's a physicist of

(27:30):
some sort. Yeah, all right, m M. I'm just gonna
say Mr or doctor Soon because I'm not positive. Uh.
He is one of a few people on the planet Earth, um,
professionals that is, who deny human climate change, human-influenced
climate change. Like you said, you said the fancier word

(27:52):
for it though: anthropogenic. Yeah, it's a good word. Um.
And he works at the Harvard Smithsonian Center for Astrophysics.
So hey, he's with Harvard, he's got the cred, right right. Um.
Turns out when you look into where he's getting his funding,
he received one point two million dollars over the past
decade from ExxonMobil, the Southern Company, the Kochs, and

(28:17):
the Koch brothers, their foundation, the Charles G. Koch Foundation. Exxon stopped funding him at some point. But the bulk of
his money and his funding came and I'm sorry, I
forgot the American Petroleum Institute came from people who clearly
had a dog in this fight. And it's just how

(28:38):
can you trust this? You know? Yeah, well you trust it
because there's a guy and he has a PhD in
aerospace engineering by the way, all right, he's a doc.
He works with this um this organization, the Harvard Smithsonian
Center for Astrophysics, which is a legitimate place. Um. It
doesn't get any funding from Harvard, but it gets a
lot from NASA and from the Smithsonian. Well, and Harvard's

(28:59):
very very clear to point this out when people ask
them about Willie Soon. Um, they're kind of like, well,
here's the quote, Willie Soon is a Smithsonian staff researcher at the Harvard-Smithsonian Center for Astrophysics, a collaboration of the Harvard College Observatory and the Smithsonian Astrophysical Observatory. Like they
just want to be real clear. Even though he uses

(29:19):
a Harvard email address, he's not our employee. No, but again,
he's getting lots of funding from NASA and lots of
funding from the Smithsonian. This guy, Um, if his scientific
beliefs are what they are and he's a smart guy,
then yeah, I don't know about like getting fired for saying,
you know, here's a paper on on the idea that

(29:42):
climate change is not human made. Yeah, he thinks it's
the Sun's fault. But he didn't he doesn't reveal in
any of his um conflicts of interest. Uh, that should
go at the end of the paper. He didn't reveal
where his funding was coming from. And I get the impression that in academia, if you are totally cool with

(30:03):
everybody thinking like you're a shill, you can get away
with it. Right. Well, this stuff, a lot of this
stuff is not illegal, right, even predatory publishing is not illegal,
just unethical. And if you're counting on people to police
themselves with ethics, a lot of times they will disappoint you.
The Heartland Institute gave Willie Soon a Courage Award, and

(30:27):
if you're not caring about what other scientists think about you... If you've heard of the Heartland Institute, you might remember them. They are a conservative think tank. You might remember them
in the nineties when they worked alongside Philip Morris to
deny the risks of secondhand smoke. Yeah, that's all chronicled in that book I've talked about, Merchants of Doubt. It's really just a bunch of scientists, legitimate, bona fide scientists

(30:50):
who are like up for for um, being bought by
groups like that. That's sad. It is sad. Um, and the whole,
the whole thing is they're saying like, well, you can't
say without beyond a shadow of a doubt, with absolute certainty,
that that's the case. And science is like, no, science

(31:11):
doesn't do that. Science doesn't do absolute certainty. But the
average person reading a newspaper sees that, oh you can't
say with absolute certainty, well then maybe it isn't man-made. And then there's that doubt. The people just go and get the money for, for saying that, for writing papers about it. It's millions of dollars. Despicable. Yeah, it really is.
Um self reviewed. Uh you've heard of peer review. We've

(31:35):
talked about it quite a bit. Peer review is when you
have a study and then one or more ideally more
of your peers reviews your study and says, you know what,
you had, best practices, you did it right. Um, it
was reproducible, you follow the scientific method. Um, I'm gonna
give it my stamp of approval and put my name
on it. Not literally or is it? I think so?

(31:55):
It says who reviewed it, I believe, in the journal when it's published, but not my name as the author of the study, you know what I mean? Um, and the
peer reviewer as a peer reviewer, and that's a wonderful thing.
But people have faked this and been their own peer reviewer,
which is not how it works. No, who's this guy? Uh? Well,

(32:19):
I'm terrible at pronouncing Korean names, so all apologies, but
I'm gonna say Hyung-In Moon. Nice. Dr. Moon, I think, yeah,
let's call him Dr Moon. Okay, So Dr Moon um
worked on natural medicine. I believe, and was submitting all
these papers that were getting reviewed very quickly, because apparently

(32:41):
part of the process of peer review is to say, this
paper is great, can you recommend some people in your
field that can review your paper? And Dr Moon said,
I sure can. He was on fire. Let me go
make up some people and make up some email addresses
that actually come to my inbox and just pose as all of his own peer reviewers. He was lazy, though, is

(33:03):
the thing, like, I don't know that he would have
been found out if he hadn't been um careless. I
guess because he was returning the reviews within like twenty
four hours. Sometimes a peer review of like a real
um study should take I would guess weeks, if not months,
like the study the publication schedule for the average study

(33:28):
or paper, I don't think it's a very quick thing.
There's not a lot of quick turnaround, right. And this
guy was like twenty four hours. When they're like Dr Moon,
I see your paper was reviewed and accepted by Dr Mooney,
It's like I just added a Y to the end.
It seemed easy. Uh. If you google peer review fraud,

(33:49):
you will be shocked at how often this happens and how many legit science publishers are having to retract studies. Uh,
And it doesn't mean they're bad. Um, they're getting duped
as well. But there's one based in Berlin that had
sixty four retractions because of fraudulent reviews. And they're just

(34:11):
one publisher of many. Every publisher out there probably has
been duped. Um. Maybe not everyone, I'm surmising that, but
it's a big problem. We should do a study on it. I'll
review it. It'll end up in the headlines. Now, every
single publisher duped, says Chuck. Uh. And speaking of um

(34:32):
the headlines, Chuck. One of the problems with science reporting
or reading science reporting is that what you usually are hearing,
especially if it's making a big splash, is what's called
the initial findings. Somebody carried out a study and this
is what they found, and it's amazing and mind blowing
and it um, it supports everything everyone's always known. But

(34:54):
now there's a scientific study that says, yes, that's the case.
And then if you wait a year or two, when
people follow up and reproduce the study and find that
it's actually not the case, it doesn't get reported on.
Usually. Yeah, and, and sometimes the scientists or the publishers, they're doing it right and they say initial findings,

(35:17):
but the public and sometimes even the reporter will say
initial findings. But we as people that ingest this
stuff need to understand what that means, um, and the
fine print is always like you know, you know more
study is needed, but if it's something that
you want to be true, you'll just say, hey, look

(35:38):
at the study, right. You know it's brand new and
they need to study it for twenty more years, but hey,
look what it says. And the more the more you
start paying attention to this kind of thing, the more
kind of disdain you have for that kind of just
offhand, um, sensationalist science reporting. But you'll still get caught

(36:00):
up in it. Like every once in a while, I'll
catch myself like saying something. You'd be like, oh, did
you hear this? And then as I'm saying it out loud,
I'm like, that's preposterous. Yeah, there's no way that's going
to pan out to be true. I got clickbaited. I know. I mean, we, we have to avoid this stuff because we have our name on this podcast. But
luckily we've given ourselves the back door of saying, hey,

(36:23):
we make mistakes a lot. It's true though, we're not experts,
we're not scientists. Uh. And then finally we're gonna finish
up with the header on this one is it's a
cool story. And that's a big one because um, it's
not enough these days. And this all ties in with
media and how we read things as as people. But

(36:44):
it's not enough just to have a study that might
prove something. You have to wrap it up in a
nice package. Yeah, to deliver it to people, to get it in the
news cycle. And the cooler the better. Yep. It almost
doesn't matter about the science as far as the media
is concerned. They just want a good headline and a

(37:05):
scientist who'll say, yeah, that's, that's cool. Here's what I found. This is going to change the world. Loch Ness monster is real.
This kind of ended up being depressing somehow. Yeah, not somehow. Yeah, like, yeah, it's kind of depressing. We'll figure it out, Chuck. Well, we do our best.

(37:27):
I'll say that science will prevail, I hope so. Uh,
if you want to know more about science and scientific
studies and research fraud and all that kind of stuff,
just type some random words into the search bar at HowStuffWorks dot com. See what comes up. And since
I said random, it's time for a listener mail. Oh no, oh, yeah,
you know what, it's time for adminis... all right, Josh,

(37:56):
administrative details. If you're new to the show, you don't
know what it is. That's the very clunky title. We're saying,
thank you to listeners who send us neat things. It
is clunky and generic and I've totally gotten used to
it by now. Well you're the one who made it
up to be clunky and generic, and it's stuck. Yeah.
So people send us stuff from time to time, and
it's just very kind of you to do so, Yes,

(38:17):
And we like to give shout outs, whether or not
it's just out of the goodness of your heart or
if you have a little small business that you're trying
to plug. Either way, it's a sneaky way of getting
it in there. Yeah, but I mean, I think we
we brought that on, didn't we didn't we say like,
if you have a small business and you send us something,
we'll, we'll be happy to say something. Exactly. Thank you.
All right, So let's get it going here. We got

(38:38):
some coffee right from one thousand faces right here in Athens,
Georgia from Kayla. Yeah delicious, Yes it was. We also
got some other coffee too, from Jonathan at Steamworks Coffee.
He came up with a Josh and Chuck blend. Oh yeah,
it's pretty awesome. I believe it's available for sale too. Yeah.
That Josh and Chuck blend is dark and bitter. Uh.

(39:02):
Jim Simmons, he's a retired teacher who sent us some
lovely handmade wooden bowls and a very nice handwritten letter,
which is always great. Thanks a lot, Jim. Uh. Let's
see, Chamberlayne sent us homemade pasta, including a delicious savory pumpkin fettuccine. It was very nice. Yum. Jay Graff, two

(39:24):
F's, sent us a postcard from the Great Wall of China.
It's kind of neat sometimes we get those postcards from
places we've talked about. I was like, ah, thanks, James,
right here. Let's see the Hammer Press team. They sent
us a bunch of Mother's Day cards that are wonderful. Oh,
those are really nice, really great. You should check them out.
The Hammer Press team. Yeah. Uh, Misty, Billie and Jessica.

(39:46):
They sent us a care package of a lot of things.
There were some cookies, um, including one of my favorite
white chocolate dipped Ritz and peanut butter crackers. Oh yeah, man, I love those, homemade, right, yeah. And uh, then some seventies macramé for you, along with a seventies macramé magazine, because you're obsessed with macramé. We have a macramé plant holder

(40:09):
hanging from my, um, microphone arm, holding a coffee mug sent to us by Joe and Lynda Hecht, that's right, and it has some pens in it. Uh. And they also sent us, Misty, Billie and Jessica that is, a lovely
little hand drawn picture of us with their family, which
was so sweet. That's very awesome. Um. We've said it before,
we'll say it again. Huge thank you to Jim Ruwaine,

(40:31):
I believe that's how you say his name. And the
Crown Royal people for sending us all the Crown Royal
We are running low. Uh. Mark Silberg at the Rocky
Mountain Institute sent us a book called Reinventing Fire. They're
great out there, man, they know what they're talking about.
And I think it's Reinventing Fire, colon, Bold

(40:51):
Business Solutions for the New Energy Era. Yeah, they're, they're
basically like um green energy observers, but I think they
they they're experts in like all sectors of energy, but
they have a focus on green energy, which is awesome. Yeah,
they're pretty cool. Um, John, whose wife makes Delightfully Delicious
doggie treats. Delightfully Delicious is the name of the company.

(41:12):
There's no artificial colors or flavors. And they got um
sweet little Momo hooked on sweet potato dog treats. I
thought you're gonna say, hooked on the junk, the the
sweet potato junk. She's crazy cuckoo for sweet potatoes. Nice.
Oh man, that's good for a dog too. It is
very. Strat Johnson sent us his band's LP, and if you're in a band and your name is Strat, that's

(41:34):
pretty cool. Uh, Diomaea... still, mhm. I think that was great. Yeah, I'm not sure if I pronounced it right. D-I-O-M-A-E-A. Uh, Frederick, this
is long overdue. Frederick at the store one five to
one store dot com, sent us some awesome low profile

(41:57):
cork iPhone cases and passport holders. And I was telling
him Jerry walks around with her iPhone in the cork holder and it looks pretty sweet. Oh yeah, so he said, awesome, I'm glad to hear it. Joe and Holly Harper sent us
some really cool 3D printed Stuff You Should Know things, like SYSK, uh, you know, like

(42:18):
a little desk one, as in like after Robert Indiana's LOVE sculpture. Yeah, that's what, I couldn't think of what that was from. Yeah,
it's awesome. It's really neat and like a bracelet um
made out of stuff you should know, three D carved
like plastics, really neat. Yeah, they did some good stuff,
So thanks Joe and Holly Harper for that. And then
last for this one, we got a postcard from Yosemite

(42:40):
National Park from Laura Jackson, so thanks a lot for that.
Thanks to everybody who sends this stuff. It's nice to
know we're thought of and we appreciate it. Yeah. We're
gonna finish up with another set on the next episode
of Administrative Details. You got anything else, No, that's it.
Oh yeah. If you guys want to hang out with
us on social media, you can go to SYSK

(43:01):
Podcast on Twitter or on Instagram. You can
hang out with us at Facebook dot com slash stuff
you Should Know. You can send us an email to
stuff podcast at HowStuffWorks dot com, and as always, join us at home on the web, Stuff You
Should Know dot com. Stuff you Should Know is a
production of iHeartRadio's HowStuffWorks. For more podcasts

(43:23):
from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows
