Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
You're listening to Why We Do What We Do. Welcome
to Why We Do What We Do. I am the host whose
master's thesis is still sitting in a file drawer.
Speaker 2 (00:23):
My name is Abraham, and I'm Reviewer Two to host Shane.
Oh, good one.
Speaker 1 (00:29):
We are a psychology podcast. We talk about the things
that humans and non human animals do and we try
to bring a nice skeptical scientific lens to our understanding
of behavior that shapes the world in which we live.
And thank you so much for joining us. We're happy
to have you here.
Speaker 2 (00:45):
Yeah, for sure. We're looking forward to hearing your feedback
on this one, because this episode is about the peer
review process. For my science nerds out there and for
my researchers, and for all the people who have written
a manuscript and turned it in and then got it rejected.
This episode is for you.
Speaker 1 (01:01):
That's right, or got accepted, that's...
Speaker 2 (01:02):
Or not accepted, that's cool too. Yeah, yeah.
Speaker 1 (01:04):
But this is about the process. You know, sometimes we
do that, we talk about processes that we put in
place because it helps us be better as a species,
and this is one of those things. And if you're
joining us for the first time, we hope that you
enjoy this. But as we said, we thrive on feedback.
You go ahead and leave us that review, because we're
talking about reviews today. You can also leave us five stars.
You can just write to us. That's all good. If
(01:25):
you're a returning listener, then welcome back. You may also
leave us a review, give us some feedback, tell us
what you think, write to us, leave us a rating,
go tell a friend. We'll talk more about the ways
you can support us. Either way, we're happy that you
found your way here, and we hope that you enjoy
what you hear in this episode. Today I started alluding
to this but didn't get through it. You can join
us on Patreon, pick up some merch, like, subscribe, tell
a friend. But no matter how you got here, and
(01:47):
no matter what you choose to say in your review,
we are going to start by wishing you all a
happy Banana Lover's Day, because it is August twenty seventh,
and you know, everyone knows that August twenty seventh is
Banana Lover's Day.
Speaker 2 (01:58):
Obviously. It's also Just Because Day. Why? Just because.
Speaker 1 (02:04):
Just because. Bananas, bananas. It is Kiss Me Day, with
consent please. But if that's your thing, then sure, yeah.
Speaker 2 (02:10):
Yeah, yeah, absolutely, It's National Petroleum Day.
Speaker 1 (02:14):
It is National Pots de Crème Day, sure, which I
think means something to do with potatoes.
Speaker 2 (02:22):
I don't know, maybe that's what I would assume, okay,
or just milk pots.
Speaker 1 (02:28):
It is, uh sure.
Speaker 2 (02:30):
It is Slow Art Day as opposed to fast art.
Speaker 1 (02:33):
Right, yeah, as opposed to fast art exactly, so, Bob Ross,
I think is where that is headed. It is Tarzan Day.
Not sure exactly what that means here, and I'm not
gonna look it up, but if you celebrate, then we acknowledge.
Speaker 2 (02:44):
You. Sure, get your loincloths out, I guess. It
is The Duchess Who Wasn't Day?
Speaker 1 (02:50):
Lots of those, sure. Most people, maybe. It is Tug
of War Day. So if you're into that sort of thing,
go pull on a rope.
Speaker 2 (03:00):
Yeah, go pull on a rope. Maybe that's gonna be
my new insult. Eh, yeah. Go pull on a
rope, and see what people say. It's Willing to Lend
a Hand Wednesday.
Speaker 1 (03:09):
Oh, that's so nice. I love that. That's a
holiday that clearly has to fall on a day of
the week. In fact, we're probably gonna say this
one every year because we publish on Wednesdays. Yeah, exactly,
and this one has to fall on a Wednesday or
else it only happens every seven years. Yeah. It is
World Rock paper Scissors Day. Oh man, competition.
Speaker 2 (03:29):
Get ready for the great equalizer that is rock paper scissors.
Speaker 1 (03:33):
We should do a mini on rock paper scissors. That
never occurred to me. But, like, I know that
there's, like, actual math and science behind, like, what
throws are most likely to happen, and, like, where this originated. Yeah,
I'm gonna write it down.
Speaker 2 (03:44):
And variations, because then you could play rock paper scissors
lizard Spock, stuff like that. Wow.
Speaker 1 (03:49):
I did not know that. You can actually also do
multiplayer rock paper scissors. I don't know if you know this.
Speaker 2 (03:53):
I didn't know that. There's also there's a version of
rock paper scissors that I believe has like twenty different
signs you could throw.
Speaker 1 (03:59):
Whoa. Oh yeah, okay, a mini it is.
Speaker 2 (04:02):
And it's like it all works anyway, all right. It
is a Be Kind to Humankind week.
Speaker 1 (04:07):
Ooh, I love that it's so much. Yes, please be
kind to humankind. That's great. Mostly, it is Brake Safety Week,
and this is not break as in shatter something. It's
brake as in B-R-A-K-E. So brakes that stop.
And I guess it's saying stop, yeah.
Speaker 2 (04:22):
Stop Week, yeah, just, yeah, stop. Well, I guess, like,
make sure you stop. Well, yeah, make sure the equipment
you have, yeah, make sure the equipment you have
stops like it's supposed to. It's Health Unit Coordinators Week.
Speaker 1 (04:33):
Good good. It's National Composites Week.
Speaker 2 (04:36):
It is. It's National Safe at Home Week.
Speaker 1 (04:39):
It's Rabi al-Awwal Week. I hope I said that.
Speaker 2 (04:42):
Right, Yeah, And it is World water Week like water
World Week is like yeah, Kevin country movies, specifically water World,
and not any of his other movies, which he was
much better in All the Man, not the Postman, not Yellowstone,
the Field of Dreams, Field of Dreams. He was a
Field dream That's that was Kevin Costner, right now, That
was Kevin Costner, not okay, Man of Steel.
Speaker 1 (05:05):
Oh yeah, I forgot he was in that. Yeah, wow, okay,
all right, great. So anyway, those are the, well,
I guess the holidays that land on the day, or
the week, that this publication comes out. Wow, that
was a weird stream of consciousness ramble. Those are holidays, yeah,
(05:27):
those are holidays that happen to occur when we publish
these episodes, and we're acknowledging them. Yes, we're actually here to
talk about other things. As we said, our primary target
today is the process by which we evaluate one another's
work when we're disseminating information, a process I think we
could probably use in many, many, many more spaces than
is currently being used, because we seem to live in
the world of dumb right now. Yes, so I'd really
(05:49):
love it if we could return to the world of
less dumb than currently exists.
Speaker 2 (05:53):
Yes, so I think we're going to talk about this,
and I will say too, like, we're going to talk
about this from the perspective of like submitting scientific manuscripts, right,
and what this does. Yeah, So just so everybody's got
that too.
Speaker 1 (06:05):
So we always need to start by defining our terms.
Perhaps you've heard this phrase, peer review. I know we've
mentioned it on here before, So if you are a
regular listener, you've heard us say it almost certainly, and
probably outside of the context of this episode you have
heard it as well. But we definitely want to define
it so that we're all on the same page before
we jump into this. So, peer review is the evaluation
(06:28):
of scientific, academic, or professional work by others working in
the same field.
Speaker 2 (06:34):
Yes, and so for those who are uninitiated, the peer
review process is an evaluation process that ensures that research
meets a certain level of rigor and quality and validity
prior to publication. That's the goal of it at least, okay,
And this is often used in the research space where
new research is submitted to a journal for review or
a board for review before it is accepted for publication.
(06:57):
And usually it involves some reviewers who take some time
to review it and provide feedback and comments, and decide whether
it's accepted or rejected, or accepted with revisions, or whatever
it might be.
Speaker 1 (07:07):
And this is a process. The whole point of what
we're going to be discussing today is to scrutinize one
another's work such that we feel like we have the
most rigorous way of interpreting and disseminating information. Yes, so
that those who then read and access that information will
(07:28):
have the clearest, most accurate picture of what we now
know as a field. And that happens every single time
one of these manuscripts is submitted. So this is part
of a self regulation process within certain fields that ensures
that a good research is published and b bad research
does not make it to publication. So this is we're
(07:51):
trying to like put in a filter here, and I
think this is why I say we could use this
in more spaces in our world today, as like where
is the filter that allows it so that we get
as much correct information as possible and we do not
get incorrect information. And we want to
talk about, like, what kinds of things peer reviewers are
(08:12):
looking for, like, how this process works. But essentially, that's
the thing to know, is, like, this is the filter
that prevents the crap from getting out there and, like,
has people wasting time on stuff that ultimately proves
to be incorrect. And it doesn't always work, but it
works a lot better than just saying, like, publish whatever
you want, nobody cares, like, it'll be great.
Speaker 2 (08:31):
Yeah, yeah, yeah exactly. And we're gonna talk about that too,
Like we're talking about like you know, yeah, sure it's
not a perfect process, but it is better. It's the
best thing we got right now given the circumstances.
But we'll talk more about that too. So the
peer review process, in its modern iteration, is a
fairly new thing, but the history of the peer review
process may be as old as the nine hundreds CE,
(08:54):
specifically when a physician named Ali al-Ruhawi of Syria
described the process in his book Ethics of the Physician.
He explained that quote, physicians must take notes describing the
state of their patients' medical conditions upon each visit. Following treatment,
the notes were scrutinized by a local medical council to
determine whether the physician had met the required standards of
(09:16):
medical care, end quote. And that
was, like, the official, the first description of what a
peer review process might actually look like.
Speaker 1 (09:24):
Man, we talked about this before, but it would be
so interesting to do some history around the scholarly work
of the Middle East. Yes, because before they entered what
I would describe as sort of the dark ages of
like religious zealousness, they were like the world hub of
(09:45):
scientific and academic achievements. So much of what the world
has gained and learned came out of the brilliant scientists
that lived in that part of the world, who were
doing good work before the sort of religious zealots took
over and plunged it into the dark ages they seem
to be in right now. Yeah, yeah, and yes I
am firing shots at you people over there who are
like deliberately censoring information and making it impossible for people
(10:08):
to do research and things like that. Yeah, because you
are standing in the way of the progress of mankind,
and therefore you are my enemy. You're not Miami or
the enemy of mankind, but as an arborer of people
speaking on behalf of the process of bettering the human condition,
I do not salute you. You're a butt.
Speaker 2 (10:24):
Yeah, yeah, yeah. Anybody who suppresses knowledge is a butt.
Speaker 1 (10:27):
Yes, yes, you're a butt. Anyway, we need to continue
expanding upon the point that you were making here, which
was that this came from nine hundred CE, possibly,
and that's amazing, like, really incredible that they were engineering
that level of brilliance that long ago. That's really awesome. Again,
it just speaks to the quality of the people that were
doing that kind of work, which was really awesome.
(10:49):
And it goes on to describe, so you were describing
this medical text, right, the doctor who would write
these notes, and then those notes would be scrutinized by
a panel to sort of give quality assurance, if you will.
And it also went on to describe what happens if
treatment is deemed unacceptable, which might be one of the
first descriptions of malpractice lawsuits as well. Yeah, speaking to
(11:09):
like, you are in fact making decisions that are in
the best interests of the client based on current knowledge,
and that like understanding that that knowledge is likely going
to change, and that's again that's part of the process.
Speaker 2 (11:20):
Yeah, it's super interesting at this point in time. You know,
we've got history going on. There's a lot of things happening,
and then comes the printing press, and so the printing
press shows up and documents are made more accessible than
they ever have been to the public, and that includes ads.
Speaker 1 (11:43):
All right, we're back. And I don't know if people
can tell from listening to this, but part of the
joy of these recordings is us trying to surprise each
other with ad segues that are like not scripted or planned.
I mean, we know, like about when we want ads
to happen. But anyway, that one got me.
Speaker 2 (12:01):
Yeah, it was good stuff.
Speaker 1 (12:03):
Which is good. Yeah, Okay, anyway, so we were talking
about the printing press, so please continue.
Speaker 2 (12:07):
Yeah, yeah, yeah. So the printing press shows up and
documents are made more accessible to the general public. But
as a result, peer review and editing became increasingly more
common because you were starting to see more publications in
scientific journals and scientific research and dissemination. So this peer
review process, in the form of editing, started
to become more common. But the evaluation of science wasn't formally
written about or codified in any sort of way as
(12:29):
we understand it today. Like, it wasn't really systematized just
yet, until Francis Bacon shows up with Novum Organum.
And so he starts to, yeah, exactly, he
starts to outline what a peer review process could and
should look like, in its, like, most rudimentary terms. Awesome.
Speaker 1 (12:46):
So in this work he described what has since become
the very first universally accepted method for new science, shaping
the scientific method as we know it today. In this process, though,
the peer review system was only thought to be an
editor's job. So, there's, like, you still have
a filter process, but you have someone who's more
(13:06):
or less in charge of, I guess you would say,
just, like, the quality of the writing and everything. Yeah,
it was not really a validity process. It was, like,
let's make sure that this is clear, that it'll make it
out to the audience, like, it's written in such a
way that it makes sense to me. And it's
having someone look over it and just say, like, okay, yeah,
I get it, yeah, that makes sense, or maybe fix
(13:26):
this part here, I don't really understand what's being said.
It did not really include at that point the level
of scrutiny of, like, bringing in other experts to really
help hone it. But that's okay. Like, these are
first steps here that are really important.
Speaker 2 (13:39):
Yeah, for sure. So this goes on for a while
and then World War Two shows up and disrupts everything.
World War Two, the worst of the world wars. So, yeah,
nobody likes a sequel.
Speaker 1 (13:52):
I think, particularly the current sequel we find ourselves.
Speaker 2 (13:55):
In. The sequel right now is not great right now.
Speaker 1 (13:57):
Yeah?
Speaker 2 (13:58):
Yeah. No, So here's the thing, World War two. Everybody
talks about how disruptive it was on industries, on innovation,
on all this stuff, and specifically you know, like military
expertise and whatnot. But they don't really talk about how
World War two was significant in that it increased the
amount of scientific research that we were doing, and in
lots of different areas, not just social sciences, but in physics,
(14:21):
in engineering, and all these different areas that we really
need to innovate, and so we see this huge boom
of new research in not only engineering and physics, but
in social sciences. In the medical field, you start seeing
the emergence of psychological services starting to take place in,
like, better ways. You start seeing medical care improve,
you start seeing nursing improve. So you see all this
(14:42):
stuff kind of happen. And the peer review process was
further systematized during this time, due to the increase in
research, to give it the much needed critique and validity
that we were looking for. We started seeing
science become more rigorous during this time.
Speaker 1 (14:58):
It is an unfortunate reality of war, and kind of
aggression more broadly, that it entails innovation. I think
there are military leaders who very pragmatically understand
the value of science as it relates to technologies of war,
and so a lot of times when wars happen, innovation
(15:20):
can really boom, which means there is a huge boom
in scientific process because they're really interested in pushing the
next super deadly weapon.
Speaker 2 (15:29):
You know.
Speaker 1 (15:29):
That's how we ended up with things like atomic weapons
and nuclear weapons, lasers, yeah, bombs that could
completely destroy the entire planet, essentially. Scientists were pushed
to really figure out how these processes work, and to use
and leverage that knowledge to engineer greater and greater weapons.
And so it's unfortunate. I do not think that it
(15:51):
was worth it to have science be pushed that far
that fast. Like, I would rather have said, let's not
have the war and not gain that scientific innovation.
But we did, like, it's complicated, it's nuanced. We
do now have that scientific innovation, so there's really no
putting the genie back in the bottle, as they say. Right, right.
But it happens a lot. Like, pretty much every time
there's been some major war in the last several hundred
(16:12):
years at least, it has come with major technological revolutions
that have been spurred by employing people who use science
and scientific methods to innovate in that space.
Speaker 2 (16:23):
Right. So that brings us to the current day. Right,
the current peer review system is pretty similar to what
we started to see back then. It's become more refined
and more polished, and maybe there's a little bit more
nuance, and with technology systems, it's made it a little
bit easier to be anonymous. But, like, what we're kind
of discovering now is that the peer review process is
(16:43):
not that dissimilar to what we were starting to find
to merge during and after World War Two. But the
question that everybody has, because most of us are anonymous,
is who are the reviewers? So I wrote this as
a kindergarten cop arnold sort who.
Speaker 1 (17:00):
Are the reviewers?
Speaker 2 (17:01):
What do they do? So I think it's important to understand,
like, how reviewers show up, what they do, who
they are, because they are kind of, like, the crux
of the peer review process, right? They're the first part.
Speaker 1 (17:14):
Yeah. I mean, if you hear the term peer review,
the first word in there is peer. Like, it
implies that the reviewers are peers of the person who's
submitting the manuscripts, right? Most of the time, yes, but
there are some interesting critiques about that. Most of the
time they call them experts in the field, but they're
not actually peers. These might be folks who are
highly qualified researchers. You've got to imagine, like, even
(17:38):
though they're not necessarily peers, they may be really good
at some particular aspect of science or this method or
something that's related, but not necessarily directly in the same
line of work. Maybe they're not directly colleagues, and actually
it's better that they don't know each other. We'll get
into that in a little bit. But those are sort
of the peers. But of course we then must ask,
all right, so you decide you're going to get these
experts to take a look at this manuscript and provide
their own critique of that manuscript. What defines an expert, though?
Speaker 2 (18:06):
Yeah, and that is a great question that we are
not going to answer today. It's a long standing discussion overall.
But let's talk about, like, what the peer reviewer does, right,
like, kind of what they are supposed to do.
It's important to understand, and not to expect,
that the peer reviewer knows every single thing about a subject,
right? They should have a general understanding,
(18:26):
they should have something like that. But, like, what they're
reviewing is new research, so they should at the very least
have some information about the topic at hand. But generally speaking,
a peer reviewer is anyone who has competence and expertise
in the subject that the journal covers, not necessarily the
article itself, but whatever the scope of the journal is. So,
(18:47):
for example, a behavior analysis journal is going to have behavior
analysts, generally speaking, as the reviewers. They're not going to
have social workers or social psychologists as the reviewers, or
a physicist be the reviewer, right? Like, not generally speaking.
Like, there may be some exceptions
Speaker 1 (19:03):
To the rule, or conspiracy theorists or.
Speaker 2 (19:05):
Conspiracy theorists, we certainly won't have those all the time. Yeah,
unless the behavior analyst is a conspiracy theorist. And that's
kind of a fun that's a fun can unpack. But
generally speaking, like a journal will have a particular viewpoint
in a particular scope, and the reviewers at that journal
will meet that scope in some way.
Speaker 1 (19:23):
And reviewers, I mean, who makes up this pool of people?
It can be quite variable. They can vary
in age and experience to some degree. And it is
reported that young reviewers actually often provide better, higher quality feedback.
reported that young reviewers actually often provide better, higher quality feedback.
It's not always the case, but a lot of times
these are people who are fresh out of classes. They
are like really in contact with a lot of the
(19:43):
recent work that's been done. They often are, like, motivated
and excited and, like, ready to lend some time
to these things, so they might provide higher quality feedback.
They might also not provide very high quality feedback, because they
don't necessarily know how to. They don't know what to
look for, they don't know how to comment about the
thing that they're looking for. So the
(20:04):
process should also entail some amount of mentorship, so that
you can become a better reviewer. But the point
is that just because someone is relatively new,
it does not necessarily mean that they will be bad
at reviewing. It also doesn't mean that they'll be good,
but it does seem to be that they may
provide higher quality, better feedback.
Speaker 2 (20:23):
Yeah, and also, spoiler alert, older reviewers also provide bad feedback.
So, like, it happens too, like, not always, but, like,
it is incumbent on the reviewer, and we'll
talk about kind of the process with that too. So, yeah.
Now, reviewers will review approximately eight papers a year. That's
usually, like, somebody who is a consistent reviewer
will review about eight papers a year on average. Each
(20:46):
reviewer usually belongs to a pool of researchers or reviewers
for a designated journal or multiple journals, and may be
selected based upon their area of competence or availability. Oftentimes,
when I get selected to review, it's based on what
my research topics are, right? Yeah. So, like, somebody
will say, you were selected as a possible
reviewer because you have studied this or you have talked
(21:08):
about the subject, would you be interested in reviewing? And
so that's usually when I get those. I don't get
it just because I'm an available.
Speaker 1 (21:13):
Reviewer, right. And generally speaking, the process goes like this.
You will receive the manuscript, usually deidentified, that's more and
more common now, which means that you don't know who
the authors of that manuscript are. You often don't know
the affiliation of those authors, like, do they belong to a university,
what group do they belong to. You won't get
that information. You'll just have the title and the rest
(21:35):
of the paper and no names. And then what you
do is you review the full document, and they're going
to look for things such as alignment, quality of writing,
design selection, treatment effects. They're going to look at the methods
and scrutinize those to ensure that the methods would allow for
an analysis and interpretation of the data. Some very ambitious
reviewers might even, like, crunch the numbers themselves, to double
(21:56):
check the numbers against what the authors purport, if
this is one that has a lot of numbers in it,
which is not always the case. They'll look for coherence.
They'll look for, basically, consistent, high quality
information to validate the manuscript and
the things that are being said in that manuscript. And
they're thinking about if this goes to publication and this
(22:20):
ends up in a journal, will the readers of this
journal be able to walk away with new useful information?
Right? Because it is conveyed in such a way that
it is as correct as we understand it to be
based on current information, it is clearly written, and it
has all of the support needed for someone to read
this and walk away with something of value. And in
many cases, the point of
(22:41):
the reviewer might also be oriented to, can this be replicated?
Like, is this written in such a way that someone
else can take this? And that will further inspire additional
research to be conducted, such as research into high quality ads. Okay,
(23:03):
we're back. Ads are not, I mean, people do actually
do research into ads, but that's not what's happening in
the peer review. Yeah, that wasn't what I was gonna say.
That's just a way to segue.
Speaker 2 (23:13):
Yeah, no, no, no, it was good. It was good.
That was good. I like it. So another thing that
you need to know about peer reviewers, because I think
this is something that always comes up as part of
the conversation, is like, you know, what are people's affiliations,
what are their motivations? Peer reviewers don't get paid to review.
Speaker 1 (23:27):
Yeah, it's volunteer.
Speaker 2 (23:28):
It's volunteer. It's completely free labor. Also, I think it's
really important to note and point out that, for
the most part, research is also a free labor exercise.
You might work at a university that does pay for
your role, and your role is to publish research and
write research and stuff like that. Yeah, but, like, generally speaking,
for me, working in the private sector,
(23:49):
if I wanted to do research, I would not get
paid for it. Like, I would have to go do
the research and it would be free labor, right? So,
like, kind of a unique thing. But peer reviewers, specifically
what we're talking about, they don't get paid to review
any articles. And so it's a service that is offered
to the field, by the field, to self regulate the field,
and at least in behavior analysis, that's really all we do. Yeah,
so it's kind of an
(24:11):
interesting thing. So you have to be, like, really invested
in wanting to do this.
Speaker 1 (24:14):
It's important if you think about, like, what could happen
in sort of the space of corruption for people who
are getting paid to do this. They could feel like,
if they're getting paid, they're obligated to say yay or
nay to greenlighting something as being published. Sure. They could
feel obligated to produce a certain amount, like, not very
(24:34):
much or very little. Like, it will alter the
kind of work that they'll be doing. So, in order
to avoid any complication that comes from specifically skewing motivation
by offering financial incentives, you get people who just want
to help the progress of science. And that's pretty much
the entire process as far as I understand it. In
(24:55):
the majority of cases, editors do not get, sorry,
journals do get paid, but, like, the editors, the
action editor who oversees the review, does not get paid,
and the reviewers themselves do not get paid. And in
many cases, the authors do not necessarily get paid for
the publication. Actually, in many cases, they have to pay
if they want it to be, like, open access, yeah,
so that other people can read it and not have to
(25:16):
pay for it.
Speaker 2 (25:16):
They don't even own their own publication.
Speaker 1 (25:18):
Yeah, yeah, they don't own it anymore. They basically
give it up to the publisher in that particular instance.
There are what are called pay to publish journals. They're
highly suspect. And this is the reason that, like, this
isn't a pay thing, is because there are journals where,
like, if you pay them, they will publish your work,
and it might go through, like, the veneer of, like,
(25:41):
the image, the process, of what looks like peer review.
But if you pay them to publish it, they're going
to publish it. And you've got to believe that if
you paid a journal to publish your work, it's not
going to receive the level of scrutiny that, like, a
scientific journal submission should be receiving and would otherwise be
receiving. That is something where you are clearly
motivated to get that work out there without it being scrutinized,
(26:02):
which means that, again, your work is suspect. So we
should be pretty distrusting of pay to print journals in
a scientific space.
Speaker 2 (26:11):
Yeah, we are going to talk about predatory journals in
a little bit too, because of that exact issue. Right.
Another thing that I think is really important, and I
think you and I could probably speak on this a
little bit, when it comes to peer reviewers and the
obligation, or the task, of reviewing a manuscript. You might
be thinking, sure, like, there's probably some criteria that you follow,
or, you know, some kind of checklist that you might
(26:31):
look for. And the answer is, no, there's not.
Speaker 1 (26:35):
Nope.
Speaker 2 (26:35):
I have collaborated on one journal that had something like that.
And also, we started the null hypothesis journal
Seven Dimensions, and we had set up some criteria for
what to look for too. For all the reviewers,
we had, like, some checklists to kind of look for
certain things that fit the values of the journal and
stuff like that. But yeah, for the most part, I've
never reviewed for a journal, at least in behavior analysis,
(26:57):
that has had any sort of review criteria, any sort
of even training on how to do it. Like, it's
just kind of, like, hey, you can review this, and
we go, okay, and that's usually the extent
of what we're trained to do in that.
But I don't know, have you ever
seen any sort of formal criteria in your review processes
at all? I never have.
Speaker 1 (27:17):
And actually, I was asked to do a review before
I had ever had any kind of, as you said,
training, which is not something that is typically offered. And so
I did my best, but it sort of
opened my eyes. Again, I didn't know what I didn't
know, and I didn't realize how little I
understood about that process. And so I went to other
professors, and sort of mentors, in grad school, and I said,
(27:40):
the next time that you are reviewing or you're an
editor for an article, would you let me do a
mock review and you can help teach me what I
should be doing in my review. And so I got
like three or four opportunities where they basically gave me
a real manuscript that either had already gone
through the whole process or was actually going through
the process, and then I would write my own review,
(28:03):
and then they would give me feedback on my review
and help sort of shape it for me. They would show
me their reviews that they had done, so I had
an idea of like where to start, and they would
give me feedback on what I had done, so that
I knew better how to submit better reviews. And that
was enormously, enormously helpful. And actually, whenever
I have been a BCBA supervisor, when I've
(28:25):
had the opportunity to review articles, I offered to
them: would you like to do this with me?
And like, you won't be a reviewer, but like, if
you would like to see this and write your own review,
I'll give you some reviews I've done. Then you can
have an opportunity to like practice this and get some experience,
and I'll do my best to give you some feedback.
Speaker 2 (28:44):
And I'm like, I'm not sure that I'm the best
reviewer in the world. I tried.
Speaker 1 (28:46):
I try to be, like, I really try to do
my best, but I think it was enormously helpful to
have practice opportunities with people who had done many of
these to help sort of shape my
repertoire around this.
Speaker 2 (28:57):
Yeah, I love that. I love that so much. Yeah,
now we know who's who. Who's the who's who?
We know the who's who. We know, we know, this
is an episode about owls now. Surprise. No,
so we know the who's who that's involved in doing this.
You've got editors, you've got the journal, you get the
peer reviewers. But let's talk about the types of peer
(29:18):
review that are done, because I think this is the
part that people don't realize that there are multiple variations
of this theme of how we can review stuff. And
the first one is called a single anonymized review. And
in this type of review, the author does not know
who the reviewers are. So the author will submit their
manuscript they don't know who's going to review it. And
(29:39):
this is actually pretty commonly used, generally speaking. But the
thing that's kind of unique about this is that the
authors don't know who the reviewers are, but the reviewers
know who the author is, right. Because that
is the difference here: that's what the single-anonymized
part of this means. The only people who are anonymized
are the reviewers.
(30:02):
And so this is great for reviewers because they don't
have to worry about softening their feedback, but they are
going to maybe provide biased feedback as a result of
knowing who the authors are exactly. So there is like
a little bit of a conundrum there.
Speaker 1 (30:13):
That actually was a lot of my early experience
with single anonymized reviews, where if I was a reviewer,
I knew who the author was, or if I was
the author, the reviewers knew who I was, or knew
the other people who were also authors with me on
those. And yeah, it does, I think, have an influence
because if you see a name of someone who you
really respect who is an author on a paper, and
you're like, well, man, I really
(30:33):
want to acknowledge this person who's like an expert,
they really know what they're talking about. I really want
to make sure that I can walk away having said like,
let's accept this for publication, even if that's not like
an implicit or an explicit thought that you're having, that
bias can exist. And I'm certain that it does
because I had that experience as a reviewer when I
(30:54):
saw a paper and I got to see who the
authors were on that, and I was like, wow,
like I know these people, they are like really respected
people in the field.
Speaker 2 (31:01):
Right.
Speaker 1 (31:01):
Alternatively, you might see one where you see an author
who you really don't respect and say like, I really
don't want this to go to publication because I hate
that guy. Yeah, or person you.
Speaker 2 (31:11):
Had somebody in mind. You're like, oh, I know who
that is. I know who I would not like the
good of.
Speaker 1 (31:15):
A life. And so it does leave an opening for bias
in there, which means that you then get double or
triple anonymized reviews. And this type of review means that
nobody knows who anyone is. The author doesn't know the reviewers,
the reviewer doesn't know the author. This is the most
common in social science spaces, and in the event of
the triple anonymized review, the authors', reviewers', and even the
(31:35):
editors' identities are all hidden from one another. And again,
the idea is to reduce bias as much as possible
in those spaces.
Speaker 2 (31:43):
Sure you've got open peer reviews. Open peer reviews are
super interesting. So for an open peer review, all identities
are known by all participants. The authors know the reviewers, the reviewers know the authors,
and this is really good for transparency, Like when you
talk about kind of like building credibility of peer reviews,
like it's good to know who's reviewing what so that
you can identify possible conflicts of interest or what their
(32:05):
general expertise is. That's really helpful. And reviewers may not
actually prefer this though, because their comments can be directly
linked to them, which you know, I translate that as
they can't be mean. Yeah, yeah, because you always hear
about like reviewer two is kind of a jerk,
and I had that experience with
my first paper. Okay, reviewer two was a complete asshole,
(32:26):
but we'll talk about what that looks like later too.
But like, open peer reviews are really interesting for that
reason that everybody knows who everybody is in that review process.
Speaker 1 (32:33):
I think you could reasonably argue, like the transparency piece
of this is really good and you're not wrong, but
I think it's just nuanced because yeah, as you said,
there then becomes the opportunity for someone to think like
I want these authors to think well of me, or
I don't want them to be mad at me, or yeah,
or I don't like them, and I want to make
my opinion of them known. But it changes the dynamic
(32:56):
of how you're going to interact with that work if
you're thinking more about the people involved than not about
necessarily the work itself. Sure, exactly. Relatedly, there is transparent
peer review. This type of review includes the publication of
decision letters along with the article, and while it aligns
more with the single or double anonymized reviews, it does
include the full publication of the results of the peer
(33:16):
review for public review, meaning that they can see what
decisions went into allowing that manuscript to make it to publication,
like sort of how it was discussed and shaped among
the people who looked at that article.
Speaker 2 (33:29):
Right, you'll see the article come out, it'll come out
in the journal. In some of the attachments you'll see
like the decision letters, the feedback that was provided by
all the reviewers, what edits were made, and any sort
of addenda that are related to that article, too.
Speaker 1 (33:48):
Okay, we're back. The next type we want to talk
about is called a collaborative review. This sounds kind of neat.
I've never done it, but I like the idea here.
So most of the time you like get
a manuscript. I'm imagining back in like the eighties
and nineties, you were like mailed it physically in an envelope,
like a hard copy, yes. But in the digital age,
(34:11):
you mostly get like an email, but you don't see
who the other reviewers are. Sometimes you don't even know
how many other reviewers there are. You just work on
that document and then you submit it, and that's all. Anyway, in
the collaborative review, rather than having reviewers independently review the manuscript,
collaborative reviewers may include a team of people who work
together and then together they review and approve or reject
(34:35):
those papers, and this usually results in a unified report
from the collaborative team. And in some cases, the team
of reviewers may work directly with the author to provide
feedback and support until the manuscript is ready for publication.
I do kind of see some benefit in that. I
do think that, like it's almost like you have like
(34:55):
review of the reviewers. In a way, you have sort
of some oversight of the people who are participating
in that review process, which I could see a lot
of benefit in.
Speaker 2 (35:04):
Yeah, I think you get like some IOA
across reviewers, right, like you get at least
some discussion too. Like you get, can we
confirm that we all agree on this? Yes? No?
Why don't we agree on this? Let's go ahead and
look at evidence, Like there's some cool stuff that could
happen in those spaces exactly, Like I feel like I
would love that, and like I feel like everybody that
is in that space too, Like the reviewers learn more, Yeah,
(35:26):
because you're learning from other people and like seeing their viewpoints,
like you just mentioned all the feedback that you got
on the other stuff, Like I think that's a cool
way to go about that.
Speaker 1 (35:33):
Well, most of the time that I've done a review,
I later got to see the other reviewers' reviews. Yes,
like the other people involved, their reviews, and I always
find it really useful to look over the things that
they said, what the action editor maybe glommed onto,
and like they said, this reviewer made a point
that I think is really important for the development of this manuscript,
(35:53):
and all that. Like, in a collaborative review, you would get that whole
process all at once, right, where like I really honed
in onto this one piece that to me was really
important in how we move forward with shaping this manuscript,
and someone else actually really glommed onto a different piece
that they thought to them was important, and I was like, oh,
I hadn't considered that angle before, and likewise they may
(36:15):
have said, like, I wasn't considering your angle before. So
I do kind of like that idea, and I think
it'd be kind of cool. And it seems like you
sort of get a version of that a little bit.
And the way that at least I've participated in these,
which is as I said, you often get to see
like the action editor's summary of all the reviewers and
their sort of final recommendations, and you also get to
see the other reviewers' recommendations as well.
Speaker 2 (36:37):
Yeah, absolutely, I love it. There's also post publication reviews,
and so this review is not like the initial peer review,
but this is an additional review that can be included
once a manuscript has been submitted and approved for publication.
So generally speaking, what happens is it allows
reviewers to come in and provide feedback regarding designs or
results or the writing, you know, the methods, and then
(36:58):
offer guidance on next steps within an identified research line.
So what might happen is like, maybe an article is
published and they share some cool insights. There are
some really great effects in their data, and then in
their results section they talk about a couple things, but
they don't really talk about much more, like they're missing
some things. So an additional reviewer could come in and go,
these are all really great ideas. Also, if you're going
(37:20):
to do this, consider this, or maybe consider this X
question or Y question, or you know, if you're going
to do this design again for replication purposes, maybe consider
this DV versus that DV, and so you can do
like a kind of an additional review that would be
like almost supplemental, or like an appendix to the
original article.
Speaker 1 (37:39):
I don't think I've seen that happen, but it seems
like it's kind of like polishing it up. Sort
of like, yeah, if you're in the post publication review part,
then you're really at the point where you're not making
any substantive changes, just like, how can we really make
this thing shine? Where we can sort
of augment what currently exists a little bit. Yeah, then
the final one here is a transferable peer review. In
(38:01):
some instances, a manuscript may be deemed incompatible with a journal,
and it makes a lot of sense. You'll actually see
this where there are journals that they have a specific mission,
like their goal is to publish specific types of articles because
they know who their audience is, and they're catering to
an audience that's looking for specific types of articles. So,
for example, if you're a journal who caters to clinicians
(38:23):
who go to this journal because they want practical, real
world solutions that they can put right into their daily practice,
then you don't want that journal to be publishing a
bunch of sort of whimsical conceptual articles that reflect on
things that don't really have any practical application, right, And
I do think that those are papers that can have
a lot of merit and value, but they would not
(38:44):
necessarily be appropriate for that journal. So in those cases,
the journal may elect to transfer the review to a
more compatible journal. So if I wanted to publish a
paper and behavior and social issues, but the theme of
the manuscript was more experimental, they could have the option
to transfer my review to the Journal of the Experimental
Analysis of Behavior, which is a much more experimentally oriented
(39:06):
publication outlet.
Speaker 2 (39:07):
Right, right, right, and nothing wrong with either of those journals,
just like, just understanding the purpose and the scope
of those journals. Right.
Speaker 1 (39:14):
Yeah.
Speaker 2 (39:14):
So both of us have published, and both of us
have been reviewers. I want to talk about that a
little bit because I think that it's valuable to have
that experience. So as an author, I have submitted I
believe six journal research articles, and two of them
were accepted. The other four were rejected with feedback, okay,
And part of that is because my writing I've just
(39:34):
never been a very academic writer, Like I'm very like
aformal writer and just I've got to pull them up.
So I've gotten good feedback on those. But I will
say on the first article that I published, reviewer two...
I want to bring this up because I think it's
important to understand, again, there are no criteria for how reviewers
review papers. So the feedback that we got from this
(39:54):
reviewer was that we didn't understand behavior analysis wow, and
that we had cited a couple of things early in
the paper that, they said, we never brought
back up, when we did. Like, so there were
errors in their review. Yeah. And we were just kind
of like no, like I did talk about this, like
here's on this page. And actually our response to the
(40:15):
editor was we did respond to this, we did address this,
here's where they do this, this is this, And the
feedback was so blatantly bad that we went in and
we requested a new reviewer. Wow, we went, no, this
is not valid, Like, we are going to go to
the editor and we're going to ask for a new
reviewer because they clearly did not read the paper
(40:36):
or they had a clear bias in their feedback. Yeah,
the editor granted us a new reviewer, and that new
reviewer went no, this is great, and and we were
able to publish. So like that was my first experience
as an author, wow, dealing with like a journal And
I was kind of like, wow, what an interesting thing,
like to find that like a peer is either biased
or incompetent or just didn't do the work right, like
(40:57):
a peer reviewer who's supposed to be a reviewer. So
like that was my experience there as an author,
but as a reviewer, Like, I've had the pleasure of
reading some journal articles that were like incredible topics, super
interesting stuff, and they never get published because they got
rejected because of some missing element or because of something
like that. So like I keep all those because I
like to see, like if it shows up somewhere else eventually.
(41:19):
But right, you know, it's a bummer to read some
really great work and go oh, like another reviewer didn't
agree with me, and the editor ultimately didn't agree with
me on the review. So yeah, I don't know, have
you had any experiences like that?
Speaker 1 (41:30):
Yeah, yeah, So I think it's probably fairly common. I
could be wrong, but I think it's fairly common to
be an author before you're a reviewer. Yeah, And I
think it makes a lot of sense because once you
have become an author in a published space, then they
can look at that. The editors of the journal can
look at that and say, like, well, they know, like
they've already met the criteria to be published in this journal,
(41:52):
so they already have some familiarity at least with the
kinds of things that we're looking for that would make
a manuscript publishable. And so my first role was as an author,
and I found the reviews to be extremely helpful, the
overwhelming majority. I will say that, like the worst reviews
are as you said, when the reviewers don't do the work.
The worst reviews are the ones that lack critical analysis.
(42:16):
I've seen like terrible, terrible reviews, and it was because
it was like a paragraph where the reviewer was like,
this was awesome, no notes, publish as is.
Speaker 2 (42:24):
Yeah.
Speaker 1 (42:25):
I was like, hold on, like, let's do some critical
analysis here, like what worked, what didn't, Like there's got
to be at least a little bit more besides saying
everything's great, please publish right, elaborate please. And I've also
seen terrible reviews similar to yours, where they were like
this didn't happen, this didn't happen, this didn't happen. This
was bad, like don't publish, And I was like, well,
(42:46):
those things did happen. I could point to you where
they happened and.
Speaker 2 (42:49):
Show you the specific evidence.
Speaker 1 (42:50):
Yeah, both where I was one of the reviewers and
I got to see that reviewer's review once
it was in the sort of decision process, and where
I was the author and I had a reviewer who
was saying things like, you didn't mention this, and I
was like, except for the three times I did, here
on this page and this on that page, right,
And they're like, well and this doesn't make any sense
how you talk about this, And I was like, how
would you talk about it?
Speaker 2 (43:10):
Then?
Speaker 1 (43:10):
Could you please go provide some, like... I would like
to know how I could write this differently, and like
I've also man, I've read papers where I remember one
time I wrote to one of my sort of mentors
who was helping to guide me through this process. I
was like, can you reject an article on bad writing alone?
She was like, no, you really shouldn't. I was like,
(43:32):
all right, fine. But actually, so I wanted to share,
like, one of my experiences with getting mentorship on
this is they oriented me as a reviewer to the
value that our job as the reviewer is to try
to help shepherd this manuscript to publication if it can be. Yes,
(43:52):
And I love that. That approach is to think like, okay,
you went through the time and effort to create this
document that you think should be out there in the world.
What do I need to look for to ensure that
this document can meet all the criteria that it would
make sense to be published? And I'm thinking about as
the reviewer, what's the mission of the journal That is
(44:13):
actually always forefront of my mind, right, And I'm thinking
like does everything that happen in this manuscript make sense?
And if not, is where the things that are missing?
Things that can be repaired? And so when I write
my review, I want my feedback to be as kind
as possible. These are people who often have sunk years
(44:34):
into delivering this manuscript, right, Like, they have put a
lot of time and effort into making this, and so
like you want to be considerate of the fact that
they put so much work into this, Like you don't
have to be a dick.
Speaker 2 (44:47):
Yeah, exactly, exactly exactly, don't be a jerk.
Speaker 1 (44:50):
You don't have to be Yeah, just like think about
like if you have constructive criticisms, say that. And so
like when I've gone and I've read papers where I
was like, this is almost unreadable. But the feedback I
put was like, there are sentences where you didn't
have a period. I'm like, I'm not sure where this
sentence ended. It seemed like it started going here,
(45:11):
and then it went a totally different direction, and it
seems like there are two topics. I'm like, I'm thinking,
probably what happened is you moved a section of the
paper around and you just missed that there was a
thing that sort of then left not connected, and like,
I want to give you as much benefit of the
doubt as I can, but I'm like, I don't know
how to read this part and have it make sense
to me, and so I'm like, I'm just not sure
where that thing would go. So I would often offer, like,
(45:32):
if this is what you meant, this is
how I would suggest writing it, so that you can
communicate that idea clearly.
Speaker 2 (45:38):
Well, if I see that, I usually go, like, the
entire paper may benefit from spending a little bit
of time on form and style, like making sure that
you have somebody else review it and catch these errors,
just in case, to kind of clean that up a
little bit. Like, I'll always give people a direction.
Speaker 1 (45:52):
Yeah, And at the very least you can always find
like here's a space if you're not sure what to do.
Put in an ad. We're back. Don't put ads in manuscripts.
Speaker 2 (46:05):
Don't put... yeah, don't put ads in manuscripts.
Speaker 1 (46:07):
That's bad news, They're bad. I think there's a lot
to say about it. Honestly, I mostly find the process
extremely helpful. I like being a reviewer. I really like
getting feedback from reviewers, Yes, but it is. I have
had some experiences sometimes where, you know, it always feels
good to me as a reviewer when I write
my review and then the action editor, who basically is
(46:28):
like they're overseeing all the reviewers, they know who the
reviewers are, They're like responsible for managing their deadlines, and
then they take the reviews and they synthesize that into
a feedback document. They always add their own feedback as well,
assuming that there's anything to add. The action editor does
a lot of work. And when I'm one of like
two reviewers and the action editor primarily cites my review
(46:50):
in their like synthesis of the reviews and their feedback,
I always feel really good about that. Yeah, as a reviewer,
I'm like, I feel like I'm on the right track
with that. I've had some reviews where I read a manuscript.
This actually happened I think within the last year, and
I remember reading it and I was like, there's some
really cool ideas in here, and I think this could
be a really cool paper. I'm like, there are just
some parts of it where I'm like, it just
(47:12):
can't... there are so many missing pieces that I can't synthesize
it into a whole, and I'm like, if you can
just fill in those gaps, you'll be great. And the
other person literally wrote there like, this is amazing, best
idea anyone's ever had. I think you maybe had
a typo; don't fix anything, publish as is.
Speaker 2 (47:28):
Wow.
Speaker 1 (47:29):
The action editor came back and basically just agreed with
me completely. And actually, the action editor greatly elaborated
on the points that I made. I felt good about
that because I was like thinking that, I'm like, I
really tried to like scrutinize this. I'm like, I wanted
to bring a level of analysis to this. This is something
that could end up in people's hands where they're reading
it and they're trying to use this to do something
with their field, their practice, their thoughts, they're going to
(47:51):
write about it as well or something. And I'm like, so,
if I'm really clear on this, then I feel good
about this going out in the world. But if I
feel like there are pieces missing, I'm going to suggest that,
like they you know, they bring some additional level of
clarity to the writing and whatnot. And that doesn't necessarily
always mean reject either, Like it could mean this is
like you're on the right path, please revise and submit again.
(48:13):
It will get accepted. And I would say that, like,
I think I've also submitted maybe six or seven somewhere
ballparking there, it's not very many. Like a lot of
people who are professionals, they published dozens of manuscripts, they
have a lot of practice. But like, I don't think
I've never had a manuscript get totally rejected. I
have had reviewers recommend that, but the overall output has always
(48:37):
been something like either revise and resubmit or accept with
revisions right those but that always means like they've always
made it to publication. It just took some time to
work into a better place.
Speaker 2 (48:51):
Yeah, yeah, let's talk about this then. So,
here we are at the stage where you know, we
understand the peer review, we kind of heard the different types,
and the question that we would ask you at this
point in time is, is this the best way to
go about actually scrutinizing and approving scientific research? And I
think there's probably some empirical question in there somewhere, but
(49:11):
I do think that like at the very least we
can talk about what it looks like scientifically and what
we can do going forward. So Accel and friends in
twenty twenty five, a very recent article, actually discuss some
important points related to the current peer review process that
I think are worth evaluating. They cite specifically problems related
to a lack of reviewers and then further a lack
(49:33):
of qualified reviewers. They cite predatory journals, political and administrative
biases which we know to be true, and poor reliability
as general problems that exist in the peer review process. So
what I'd like to do is take each one of
those and talk about some of those points and kind
of unpack them a little bit as we go forward.
So that's what we're gonna do right now.
Speaker 1 (49:52):
Okay, that sounds good. Yeah, So let's start with quality.
So apparently eighteen percent of accepted papers, meaning they went
through the peer review and made it to publication,
were eventually found to have statistical errors. This was thought
to be a result of poor scrutiny in the peer
review process. Accel and friends go on to cite that
research outcomes were poorly described, conclusions were not accurate, results
(50:16):
were not reproducible in some cases, a huge issue in
many fields, particularly psychology. Yeah, and so it's worth asking,
like what's the impact here, And if there is an
increasing rate of retracted articles after publication, it just means
that those errors are making it through a filter that's
supposed to stop those errors from getting through, and that
(50:38):
does happen. Like, the error rate on, like, blog posts is
through the roof, probably over one hundred percent somehow. Like,
it's that bad. So like having it down to eighteen
percent is definitely a lot better than just let's blow
everything up and let people do whatever. But it is
an unacceptably high rate, I think, considering that these
(50:58):
should have gone through a process that
prevented those errors from happening in the first place, and
that the whole point of that process was to stop
these errors from making it to publication. So it's either
people were rushed or they just didn't spend the time
and do the work to really make sure that everything
made sense. But like I have actually gone through sometimes
(51:19):
and tried to like recreate the graphs or alternatively reverse
engineer the data that would have produced that graph so
that I could understand like the numbers that went behind it. Yeah,
when I was more ambitious in my younger years, I
did that. But like that can be a really great
step in ensuring that like no steps were missed in
(51:40):
the process.
Speaker 2 (51:40):
Yeah, yeah, yeah. We brought up predatory journals earlier, and
I think this is really important. It was reported that
as of twenty fourteen, there were nearly eight thousand predatory journals,
whoa, that published nearly half a million articles. Uh oh.
so yeah, yeah, that's roughly. I did the math. It's
about sixty two to sixty three articles per journal.
Speaker 1 (52:01):
Oh my god, I know.
Speaker 2 (52:02):
So what does that mean? Well, it doesn't necessarily diminish
journals with good reputations, like the ones that have high
impact factors, that have good peer review processes, but it
does dilute the science pool, due to researchers stating,
and this was in the article, that they would
still continue to publish in these journals despite
knowing they were predatory, and specifically there were researchers that
(52:25):
they cited who were working in a space where
publication was a big deal. So one motivation for this
was that in the academic space, institutions abide by a
publish-or-perish viewpoint, which means that they're looking at the number
of publications, not the quality of publications. So predatory journals
are a really nice outlet to kind of boost those numbers,
like pump up those numbers without actually having the quality
(52:46):
behind it. Yeah.
Speaker 1 (52:48):
It like sort of guarantees that you can publish something,
because all you have to do is pay to have
it published, and then it will be published, and then
you can say, look at how many publications I have.
Speaker 2 (52:55):
Yeah, I agree.
Speaker 1 (52:56):
You know, I understand why universities want to have
some, like, I guess, criteria for like a minimum threshold
of publications to like earn tenure and get raises and
sometimes even maintain your job. But I think that sometimes
it artificially incentivizes the wrong thing. Like I think what
they want to be incentivizing is people doing good scientific work,
(53:20):
but they're focusing on the output of that work and
not the work itself. And so it's like, rather than
you publishing an article that gets cited a lot and
becomes a major thing that is used by people in
the field because it was so influential and so important.
You get articles that are just published because they could
(53:41):
be published, you know, you get quantity over quality sometimes,
And I do think that most authors, I think
most people who are in these spaces, are really legitimately
trying to do really good work.
But you do incentivize, I think, toward the wrong thing
in a lot of these places. Right, So biases occur
(54:02):
within the peer review process as well. As we mentioned earlier,
this isn't just reviewer biases. Sometimes there are systemic biases
that can favor some authors, some topics, methods, institutions, or groups.
If you've ever noticed a researcher with an abundance of
publications in a certain journal, and this can be found in
conference presentations as well, it simply implies that, like, that is
(54:23):
a home that the author has found where they can
publish a lot. Yeah, because either it's easy for
them, or they're well known because it is one of
the single-blind review processes, or, you know, there
might be other things. But it's important to understand that, like,
biases can favor these and like, let's take for example,
there are these journals. Calling them scientific journals is being
(54:47):
way more generous than I'm willing to be. But they
are journals that publish, like, pseudoscience things. So you get complementary
and alternative medicine journals, for example. There's several of them.
There's a whole bunch of them that are out there.
They're very likely to accept any publication that comes across
their desk that makes complimentary and alternative medicine look good. Yeah,
if it is critical of what they're doing, then they're
(55:08):
less likely to publish it. And that might be the
case for a lot of a lot of journals, like
if they're not necessarily going to publish something that is
critical of the journal or the mission of that journal,
but like they're going to be very friendly. Like you
come in and you're like, look, you guys, powdered unicorn
horn is like the greatest medicine ever and all you
have to do is wish it into existence and then
(55:29):
you will cure cancer. And these journals are like, sweet,
that sounds really good. Let's publish that right, no notes,
you've got it on lock. And like there are a
lot of journals I think that have that sort of bias,
and so like if you publish something that is in
line with things they like to read, then you know
it might get published with very little scrutiny.
Speaker 2 (55:49):
Yeah, yeah, exactly. And the last thing that we'll bring
up to here is poor reliability, right, And this is
one of the greater concerns related to overall acceptance of
papers: poor science may beget poorer science. Right, So
a bad article can go out and then somebody will
cite that article and it will continue forward until eventually,
now we have kind of like this whole body of science
that is not actually good science because of that one
(56:11):
bad study to begin with. So a bad study can
perpetuate ongoing bad science and ultimately dilute the rigor of publication.
Meanwhile, rejected papers may get republished in other journals and
later win Nobel prizes. So there's another side of this:
it's happened multiple times where, like, good science
was rejected as a result of poor peer review reliability. Right,
I'm telling you, like my article, like I brought up
(56:32):
the article about that Reviewer Two, that article has been
circulated thousands of times now because of the discussion paper.
But like we were told that it was going to
be rejected, and it was one of those things where
it's like it's had a pretty decent impact factor compared
to some other works that I've done, simply because
I think the topic is relevant and whatnot. But it
was one of those things that like, if it would
(56:53):
have been rejected, it wouldn't have gone out and it
wouldn't have influenced folks, right, yeah, yeah, And we also know,
according to current statistics, the reliability measures of peer reviewers
haven't really improved since the seventies. It's been pretty consistently
flat lined. And that's not to say that all peer
review is bad and that peer reviewers are terrible and
anything like that. We're not saying that, not by a
long shot. But it does highlight an inherent problem with
(57:15):
the system at large in that it is systematized, but
it's not structured in the way that it could be
to improve these processes. And there's going to
be a couple solutions that we're going to talk about
in this next section.
Speaker 1 (57:25):
Well, let's go ahead and make that transition though, And
what better way to make that transition than with some
ads of course. All right, So, as we said, like
the system that we have is a good system, it
could be better, though, we think, yeah, it's certainly better
(57:46):
than how most information makes it out into the world.
It's because I think there is a there's an inherent
reason for this, which is that this is a lengthy process.
As we said, manuscripts can take years to make it
to review and then another year or more to make
it to publication. It could be less like sometimes they're
really quick turnarounds, but like the process of running
a manuscript through review can take a long time. And like when
a manuscript can take a long time. And like when
you've got like information that needs to be disseminated, like
right now. This is one of the reasons that, like,
TikTok and Instagram, and
like Twitter and Bluesky, are some of
the places where breaking news happens: it's happening
faster than the news can report it. Yeah, and like
information is power, and people seek out information and want
(58:28):
to know what's going on. And so anyway, this process
is very slow, but it is like the best process
for filtering for really good information. But as we said,
there is a better way, There is a way forward
through the current peer review process. We can do better, and
there are some proposed solutions in the current process that
might facilitate that.
Speaker 2 (58:50):
Yeah, So one of the first suggestions is open and
transparent peer review. The overarching goal of this type of review,
like we mentioned before, is that it can improve the
credibility of the review process by showcasing who the reviewers are,
what their feedback might have been, the entire decision process,
all the way through. It will make it so that
it's very clear and transparent how we arrived at this
decision and then leave it open for scrutiny from the
(59:13):
public or like other peers that might exist in the field.
So we do have that opportunity to do that as
part of the process.
Speaker 1 (59:18):
Another one, as we've sort of alluded to, is reviewer training,
and this is like getting people ready to be reviewers.
We train reviewers how to conduct a review. We can
start to establish some criteria by which we do reviews,
like what kind of things are you looking for in
the review, things that, when you are crafting your
review, you are making sure you are accomplishing. Journals
(59:41):
can increase reliability and validity through
systematic measures and ongoing oversight of the reviewers. And it's
like there are kind of versions of this process now,
they're just they're kind of done in whatever style makes
sense to the people who oversee those processes now, and
it's not very regulated or systematized, and it's also
(01:00:02):
a little opaque, like we don't really know how those
decisions are being made all the time.
Speaker 2 (01:00:06):
Right right. We could also look at pre print peer review,
So pre prints are really great opportunities to share
preliminary information about research prior to peer review. So
this is before that paper enters the peer review process,
and it's a great opportunity to share a study ahead
of time, receive public feedback about the manuscript prior to
formal submission for publication, and then start kind of looking
at it from that perspective where you can actually start
(01:00:27):
doing the edits and get that feedback prior to any
sort of formal review from a journal that you're interested
in publishing in.
Speaker 1 (01:00:34):
I think we did a mini on pre prints, did we?
Or did we do a full episode? We've talked about
it before.
Speaker 2 (01:00:39):
We did a mini on preregistration. We might have done, yeah,
but that's a different thing. We didn't even talk about
it in this one. But like preprints are cool, like
you can write the manuscript and publish it and like, hey,
have at it before it ever goes anywhere.
Speaker 1 (01:00:52):
Right right, Yeah, yeah, yeah, that's a cool IDEA reviewer
incentives recognition and availability. There is a severe lack of
research reviewers, especially in behavior analysis, and it makes sense.
Again as we mentioned, you don't get paid. There's no
incentive for review besides love of the game. Journals could
consider some monetary compensation. It can take a long time
(01:01:13):
to do reviews. You might spend many hours on these.
One of the early reviews I did, where again I
was very ambitious, I read every article in the
reference list of the manuscript, every one of them. Wow.
Because I wanted to understand the history of what research
line led to that manuscript being done. And I learned
so much. It was actually one of my favorite ways
(01:01:33):
to learn about a topic. I read them in
chronological order of, like, when things were published, and I
got to see each new iteration
on that line of research. It was really cool. Yeah,
but it took a very very long time. So anyway, sure,
if journals could give some kind of monetary compensation, since this
is otherwise a free labor of love, generally speaking, that
might help. That could introduce some bias and problems there,
(01:01:55):
so like we would need to be really careful with
that system. And again, I think it would have to
be based on merit and quality, but not necessarily quantity
or result.
Speaker 2 (01:02:04):
Sure exactly. We can also look at recruitment processes. So
right now, most journals don't really have a formal process
for recruiting reviewers. I don't know about you, but
one day somebody went, hey, will you review this?
And I went, okay, and then
now I was recruited as a reviewer. So, like, it
wasn't a process where I applied to be a
reviewer or had any of those situations. Like
(01:02:25):
I said, I don't know what your experience was, but
that was my experience kind of stepping into that space.
And if we can improve the criteria for who can
review and ensure there's a diverse reviewer pool, then the
quality of reviews can also improve. Right, So we have
people who can dedicate more time to reviews because they're
not reviewing as many per year. We've got diverse perspectives,
(01:02:45):
we've got people who are experts in different areas who
are willing to give time and expertise, Like, we can
do a lot of cool stuff if we have a
decent reviewer pool.
Speaker 1 (01:02:54):
That diverse perspectives thing might be the most important piece. Yes,
extremely important. And I think it's very easy to fall
into the pattern of like we're all trained the same way,
we all agree with each other, we're basically in an
echo chamber. No, not in science. We want a lot
of people who are thinking about those different ways, and
we're not necessarily all going to agree, and then they
(01:03:15):
might not necessarily make a meaningful contribution every single time.
But we want the diversity of opinions in there so
that we can learn from each other.
Speaker 2 (01:03:23):
Yeah.
Speaker 1 (01:03:24):
That has got to be one of the most important features.
Speaker 2 (01:03:26):
I love that, Yeah, absolutely, all right.
Speaker 1 (01:03:28):
The last one we have here is a more transparent
editorial process right now. Again, the decision making process is
a little bit opaque. We don't really understand exactly how
people are making their decisions. In some instances, an article
may be approved by reviewers but rejected by the editor, and while
there is usually some explanation, they don't often give formal criteria,
(01:03:48):
although they might offer at least some
specific details. Sure. And making the entire process more clear
and transparent might improve the credibility of the journal and
the editor and allow some critique of the decision making
process and reduction in bias by having some more information there,
Because if we can compare the decision making process against
(01:04:10):
like articles that are similar, where one gets published and
one does not, then we can say, like, where, like
what was the difference in here that led to one
being published over the other. Was it in the decision
making process because of bias? Like if somebody came at
this from a position they're like, I just don't like
these people, or I just don't like this topic, or
I do very much like these people, and I do
very much like this topic. So I'm willing to like
(01:04:32):
let some things slide, Like those are helpful things for
us to know, right so that we can do better.
Speaker 2 (01:04:37):
Absolutely absolutely so I think for me to like really
kind of drive this episode home, Like, I think it's
important to remember the peer review process is a process
that is a human invention, and a fairly modern one
at that, right, Like, it's a fairly new process in
the scope of science and humanity and society. Like
all things, we're still learning how it works and
we're looking at ways to improve it. Thankfully, we have
(01:04:59):
the opportunity to be able to do this and to
improve this through some proposed solutions that we mentioned. And
we know that the peer review process is not perfect,
and it is better than nothing, but it's also
pretty decent right now. Like if it's done well, it
can be pretty great. And so, like you know, I
don't think there's anything wrong with critiquing something that
exists in a field where we want to advance knowledge.
(01:05:20):
I think that we always have to be critical about that.
But I think the peer review process can be
pretty good right now comparatively, and I think that we
can just make some tweaks and changes along the way
to make it a little bit better as we go.
Speaker 1 (01:05:32):
Yeah, I think it's reasonable to conclude that the peer
review is the best process we have right now, yes,
for disseminating high quality information. And that does not mean
it could not be better. It could be a better process,
possibly a lot better, but it is
a very good process that we have, where we
(01:05:55):
scrutinize these works so that the information can be, I
guess I want to say, evaluated for its quality as
much as possible, so that by the time it makes it
to publication and is widely available, we have said, like,
we've done our due diligence here. If you read this,
you can trust that, like a lot of work has
(01:06:17):
gone into ensuring that the information you're reading is as
clear and as accurate as it could possibly be. Yeah,
and like that's that's a really important thing, like that
we can trust where our information is coming from and
do so without bias against like a lot of things,
Like it's clear that this is not advancing some specific
agenda when it's done through good scientific process. It's kind
(01:06:39):
of like, you know, we're going through this process right
now in the United States, where like we can't trust
government websites anymore. They are deliberately withholding information. Like, if
they don't like the numbers, they fire the people in
charge of those numbers and they get new numbers. They
are withholding information, they're removing valid information. There are countries
with dictators that are famously rated as like lacking democracy
(01:07:02):
and having violent human rights abuses, who are no longer
being called out for their human rights abuses.
They're basically saying everything's hunky dory over there. Nobody looked
behind the curtain, please, Right, That's the kind of thing
that's going on in the US government, so like we can
no longer trust our government, and that's the thing that
the people here voted for. We voted so that we
can put people in charge who will wreck the systems
(01:07:22):
that we had in place to allow for accurate dissemination
of information to happen, and now we can't trust it, right,
and that sucks, Like that really sucks. So at least
we have the institutions of science to hopefully continue to
be the guardrails against these evil forces.
Speaker 2 (01:07:37):
Yeah, most definitely.
Speaker 1 (01:07:38):
All right, Well, we've gone very long and we still
have to recommend some things. In fact, if you're joining
us for the first time, you may not know this.
Matter of fact, you wouldn't know this, probably. At the end
of each discussion, we make some recommendations that are usually just lighthearted,
fun things. Maybe they're heavy, I don't know, but it's
just stuff. It could be books, it could be music,
it could be games, it could be shows, it could
be activities.
Speaker 2 (01:07:57):
Whatever.
Speaker 1 (01:07:57):
We recommend some things, and it's usually not really related to
our topic at hand, and that will be the case today.
We also have some listener mail to get to because
we've been having some really cool interactions with our listeners,
which we very much appreciate, and if you would like
to write us and tell us your thoughts on peer review,
your experiences with peer review, we're very happy to read
those things, and in many cases we're happy to share
those things in the form of a listener mail. You
(01:08:19):
can reach us directly by emailing us at info at
wwdwwdpodcast dot com. You can also reach us on the
social media platforms. We just recently read a listener mail
that came to us that way. Man, we look forward
to hearing from all of you. If you'd like to
support us, then you can join us over on Patreon.
You get ad free episodes, early episodes, all that kind
of good stuff, and I will shout out your name
in a list of people who help support us that
(01:08:40):
we very much appreciate. So with that being said, here
are the following people who have earned our thanks. Thank
you so much to Mike M, Megan, Mike T, Justin,
Kim, Brad, Stephanie, Brian, Ashley, Kiara, and Charlie. Thank you
for being there. You guys are really the best.
Speaker 2 (01:08:55):
You're the best.
Speaker 1 (01:08:57):
You can also support us by leaving us a rating
and a review. Wherever you listen to podcasts, you can
do one or the other or both. But we appreciate
if you have some nice things to say that you
say them to help other people find us. You can
tell a friend, you can like or share or do
whatever that just helps get the word out. That is
actually one of the most powerful things you can do,
beyond of course just being a supporter, which is great.
Also you can go steal someone's phone and subscribe to
(01:09:19):
the podcast on their phone. That's also good too, and
then give it back to them and give it back
to them. Yeah yeah, don't just steal things. But
also also very importantly, thank you to my team of people,
without whom I could not make this podcast. So writing
and fact checking from Shane and myself, our social media
coordinator is Emma Wilson, and our audio engineer who makes
everything sound like a podcast episode so that you can
(01:09:42):
listen to it and hear it and you can critique
it and review it and go about your day is
justin So thank you very much to my team of people.
Speaker 2 (01:09:49):
Y'all are the best.
Speaker 1 (01:09:50):
You're one of those team of people.
Speaker 2 (01:09:51):
Hey oh hey, yes, I am so are you look
at us?
Speaker 1 (01:09:54):
Is there a thing I'm missing? Before we move on
to our listener mail and recommendations.
Speaker 2 (01:09:57):
No, that covers it.
Speaker 1 (01:09:59):
I think just one ad.
Speaker 2 (01:10:00):
Oh, always with the ads.
Speaker 1 (01:10:10):
All right, we're back with Let's start with some listener mail.
All right, this one comes to us from actually one
of our patrons. You don't have to be a patron
to write to us, but one of our patrons did,
and so thank you to Brian for writing in. This
is following up on an episode. We published a
nostalgia episode on board games, and Brian ran with some
(01:10:30):
further recommendations. He said, I wanted to throw some other
suggestions your way that weren't mentioned in case you haven't
heard of them or just haven't played. If you have played,
what are your thoughts? He mentioned a game called Puerto
Rico. He says it's a great combination of mechanics including role selection,
engine building, resource management, and shipping and trading, all with
variable turn order and a following mechanic that keeps you
busy the whole time. You're assuming the role of a
(01:10:51):
colonial governor during the age of exploration in Puerto Rico. In Septima,
you take on the role of a witch trying to
become the next leader of your coven. It provides simultaneous action selection,
engine building, area control, and resource management. There's a huge
emphasis on risk versus reward and social suspicion. There are
a lot of things going on in this game, but
definitely a fun one. Dune Imperium blends deck building and
(01:11:14):
worker placement, all set on the backdrop theme of Dune,
the book series Dune, yes, with a lot of the
sort of character cards now based off of, like,
the actors who play the various characters in that story.
He says, this one's great, has so much replayability. The
rise of Ix expansion is great. Spice must flow, and
then he says Isle of Cats: rescuing cats from a
(01:11:35):
doomed island by placing polyomino cat tiles onto your boat.
It combines card drafting and tile placement as a fun family one. First,
let me say thank you so much for writing in. This
is a really fun list. I am familiar with Puerto
I've never played it, but it looks really good and
I know people who love it as one of their
favorite games. Septima I've never heard of, so I'm very
curious about this witches' coven one. Dune Imperium
(01:11:57):
I have played. Shane, this is definitely
on the much heavier side of like board games
relative to what you'd be familiar with, but I'd be
curious to see how you'd enjoy it given your familiarity
and history with the Dune series.
Speaker 2 (01:12:10):
As I say, I just love Dune so much that
I would sacrifice so much time to play it.
Speaker 1 (01:12:16):
It is like it's legitimately a good game.
Speaker 2 (01:12:18):
I played this game.
Speaker 1 (01:12:19):
I think that for me, the battle section of the
game is not the strongest, but the rest of the
game mechanics work really well. And it's not that it's bad.
I've just played other games that I think handled
it a little better. But it is. I love deck building,
I love worker placement, and those two combined work
really well. It is a very very good game, So
I'm not trying to criticize it. I think it's great.
(01:12:40):
Isle of Cats I've heard of, but I've never played.
It seemed like one of those games that didn't have
much strategy, but the way that Brian described it, I'm
actually like really curious now because I do like polyomino games.
I like having lighthearted, easy games that are like quick
and fun to play but don't require a huge amount
of setup or teardown. So those are appreciated.
So anyway, thank.
Speaker 2 (01:12:59):
You so much for writing in Yeah, that rules, thank you.
Speaker 1 (01:13:02):
And those are more recommendations for other people if they
want to check out those games as well. So Puerto
Rico and Dune Imperium I definitely understand to be heavier.
I'm not familiar enough with Septima, but it sounds like
it's on the heavier side as well. Isle of Cats,
I think, is probably a nice, light, easy chill game
for people who are not really in the hobby to
pick up and play. Yeah, yeah, for sure. But I
think that is what we have to say about that
for right now. Let's go ahead and move on to recommendations.
Speaker 2 (01:13:25):
Oh so excited recommendations. So I'm going to recommend a
new zine publication. And the reason I'm recommending it is
because I wrote it, and so it's a zine. I'm
going to do a series of zines called Radical Care,
(01:13:47):
and it is really I was influenced by the book
Let This Radicalize You, about like doing something, and for me,
I know that my strength in advocacy is dissemination, so
like I feel like I do a pretty good job
disseminating the information, sharing information, making it palatable, making
it useful, and giving people direction. And so that's the
goal of Radical Care is taking some social issue that
(01:14:08):
has come up and giving information about that social issue,
giving direction and resources about that social issue, to really
paint a picture of why this is a problem, what
we can do to do better, and some actions to take.
And so the first issue is about homelessness and houselessness,
and I started working on it when
we did our episode on this, and I was like, oh,
(01:14:30):
I think we could do something really cool. And so
finally I was able to finish it up and put
it together and it's like twenty five pages. It's a quick,
easy thing, and it's digital. It's free, and
I'm making it so that it will
forever be free under every circumstance that we
can think of. So, yeah, we published the first one.
So Radical Care Issue one is about houselessness. I'm working
(01:14:52):
on issue two right now, and it will be about
the death penalty and capital punishment. So I'm going to
start tackling like pretty heavy stuff on that on that front.
Speaker 1 (01:14:59):
So all right, you communist. Well, the link
for this free zine will be in the show
notes for this. So wherever you're listening to this, if
you just look at the episode and you scroll down
to where the notes and section stuff are, I actually
usually put recommendations right toward the top. There's like the
description of what the episode is, and right under that
(01:15:20):
are our recommendations. So you should have a clickable link
there that will take you to this very cool I'm
recommending a board game as I do. This is one
that I have played fairly recently. It is a style
of mechanic that's pretty unique. It's called flick and write.
There are lots of something-and-write type games, most
commonly roll and write, and that's where you roll a
pair of dice or some amount of dice. And there's
(01:15:42):
also flip and write, also called draw and write,
where you have a deck of cards, you'll flip it
over and then you'll take some things. The way that
it works is you have essentially everyone has their own
sort of score pad, and you have dry erase markers,
and there's a board in the middle of the table
where the board has sort of big plastic sides along
the edge of it, and you have these little wooden
discs you flick on to the board and you can
(01:16:05):
knock other people's wooden discs. And you're flicking five wooden
discs that all have a number one through five on them,
and then depending on where they land, you can use
them to score different sections of your scoring pad. So
there's an element of you trying to flick your disc
into a place where you can get it, to try
to knock people out of positions that you don't want
them to score very well, and then for you to
(01:16:27):
just strategically use your scoring disc to fill out your
scoring pad the best that you can. Super easy game.
You can play it, and once you're familiar with it,
I think you can play it in probably twenty minutes, yeah,
because you just go a few rounds doing this and
then that's it. It was a very entertaining little jaunt
where you're flicking discs onto a board and then trying
to fill out your score sheet the best you can.
(01:16:47):
Not particularly complicated. I think most people can pick this
up and play it. Some of the scoring and symbols
are a little weird. I think they could have done
a better job explaining some of the symbols on there,
but if you read through it and you like double
check where they reference those things, it'll become fairly clear.
The rule book is, I think, only four or five pages,
which for most board game rule books is pretty pretty light.
Speaker 2 (01:17:05):
Yeah yeah, and like in.
Speaker 1 (01:17:06):
Addition to like one of those pages is credits, and
one of them is if you want to play like
a variant of the game or something, so like it's
probably only three pages of rules, really.
Speaker 2 (01:17:14):
Yeah, yeah, for sure. All right.
Speaker 1 (01:17:15):
Anyway, so that is Sonora the Board Game. If you're
interested in checking out this wild flick and write game,
and then the new zine by Shane Spiker, Radical
Care issue number one, Free to All, Free to All.
All right, well, we've gone very long, so we should
probably wrap this one up. I think we've said all
the thanks and all the credits and done all the
things that we need to do. Is there anything you
(01:17:35):
like to add or anything that I forgot before.
Speaker 2 (01:17:37):
We move on? Nothing today?
Speaker 1 (01:17:39):
All right, we've accomplished peer review and reform of peer review.
Here on why we do what we do. I am
el imming today. Thank you all for listening. This is
Abraham and.
Speaker 2 (01:17:50):
This is Shane. We're out. See ya.
Speaker 1 (01:17:53):
You've been listening to Why We Do What We Do.
Speaker 2 (01:17:56):
You can learn more about this and other episodes by
going to ww D w w D podcast dot com.
Thanks for listening, and we hope you have an awesome day.