Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome to Stuff You Should Know, a production of I
Heart Radio. Hey, and welcome to the podcast. I'm Josh Clark,
and there's Charles W. "Chuck" Bryant, and Jerry's here. Jerry's back.
Everybody looking well rested and sun kissed and everything, and
(00:22):
this is stuff you should know. She's like a beautiful,
juicy orange. That's right. That's right, Chuck, that's a really
apt description. Ready to be squeezed. I wish I could
squeeze her, but we're still not squeezing. Uh no, not
in this pandemic. Are you crazy? You out of your mind? No,
(00:43):
no squeezing. Um Yeah, even Robert, Robert Plant wouldn't let
you anywhere near him. Okay, Chuck, figure out that joke. Robert Plant, squeeze my, my lemon, the Lemon Song, No
until the juice runs down my leg. Yeah, that's the
(01:05):
lemon song. Right. No, I don't think so. I think
it is. I don't think it is, Okay, I think
it's a I don't think it is the Lemon Song, man. I think it's, uh, Whole Lotta Love. It's Whole Lotta Love, all right. It's maybe the dirtiest thing
that was ever said in like a top ten song. Okay, regardless,
(01:29):
I'll just let the emails take care of this, I
really think. No, the Lemon Song is, um, I love those lemons, right? No, the Lemon Song is all about
how how how you have friends, like you want to
have friends, and like friends are good to have. Okay, yeah,
(01:50):
may be all wrong. No, I think it's Whole Lotta Love. Yeah, it is. I'm decently sure, buddy.
All right, well I encourage you not to google the lyrics.
Then well we um we could ask our good friend
and, um, Stuff You Should Know writer Ed Grabianowski, the Grabster. Look at that. Because he is in a
band and has been for a while. We've mentioned it before,
(02:11):
Space Lord, which has just a super cool Zeppelin-esque sound to them. Um, and they just, um, they just released a new single, which you can find on Bandcamp by searching Space Lord. Not the Space Lords. No, not Vinny and the Space Lords. Yeah, Space Lord. Just
(02:34):
look for Space Lord with some cool, uh, graphics, and
that'll you know that's Ed. You'll know it's the Grabster
but uh yeah, good stuff. We also have a game
out that Trivial Pursuit made. Yeah, we should plug our
own stuff every now and then. We just did. Yes,
it is a co branded game with Trivial Pursuit from Hasbro,
(02:55):
and it is not a Trivial Pursuit game that you are used to. It is a Stuff You Should Know game that Trivial Pursuit was happy to, uh, co-brand with.
So just, what, I don't want those emails that are like, this, this, this is a Trivial Pursuit? This is some other different game. You're always worried about the emails, aren't you?
(03:16):
Don't you just ignore them? Let them roll off, roll off my back. Like, I'm disappointed in you guys for this.
I haven't even listened to the episode, but I'm disappointed
about that. I just got one of those. Did you
see that It just rolls off your back? Yeah, yeah,
those are those are always great. I didn't listen, But
here's what was wrong. I wrote that person back. Actually,
(03:37):
I was like, we actually kind of did exactly what
you hoped we would do. And they're like, oh, sorry
for being presumptuous anyway, Oh all is forgiven. So, um,
we're talking today about bias Chuck, and I want to
set the scene a little bit because you know, um,
(03:57):
one of the things that I'm always like harping up
about is like the death of expertise, right, And it's
a real problem, like this idea that science can't be trusted,
that people who go and spend a decade or more
learning about a specific thing that they go out and
become an expert in or that's their profession, that's their training,
(04:19):
um that those people what they have to say is
basically meaningless, or that it's it's no better than somebody
on the internet's opinion about that specific subject that
that person spent ten or twelve years being trained to
be an expert in. Like that kind of stuff to
me is like super dangerous, Like it it's it's in
a there's an erosion of something, and it's a an
(04:40):
erosion of intelligence to start with, but it's also an
erosion of just believing in facts and knowing that you're
not being taken for a ride or hustled. And is
it a huge enormous problem that we're just beginning to
wake up to and is still unfolding. It's not like it happened and now we're, like, recovered from it. It's
(05:00):
still happening in real time, and it is a massive,
huge issue one of the biggest issues that that humanity faces,
I think because it encompasses so many other large issues
like climate change, existential risks, the pandemic, um politics, all
of them kind of fall under this this erosion of
(05:23):
belief and facts and that there are people out there
who know more than you do. Um, it's a big problem. Yeah,
imagine being someone who studied and researched something intensely for
ten or fifteen years, uh, and then, when presenting facts, to be met with, I don't know about that. That's
(05:43):
a response I hear a lot in the South. Yeah,
or that they saw something on YouTube that flatly contradicts that,
and it's like that it doesn't matter what you just said.
Is ridiculous that somebody posted something on YouTube and that
that, like, that has as much weight as what somebody who spent ten or twelve years studying this very thing
(06:04):
has to say about it, like knows exactly what they're
talking about, has to say about it. It's it's maddening. Yeah,
there's there's something about people from the South in general,
and I think that are in this group that I
have literally heard that response from a lot of different
people when I've been like, oh no, no, no, no, here are the facts actually, and then when presented with something that
(06:26):
they can't refute, they say, I don't know about that,
and like that's it. That's the end of the conversation.
That's different than the people I've encountered. The people I
encountered like their brow furrows and they start pointing fingers
and their, their tone goes up, like, are you hanging out at the country club or something? I think
it's different types of people that, you know, there's ignorance,
and then there's also people that actually think they're better
(06:49):
informed that will fire back with YouTube clips. Right. So
the reason I brought that up is because and one
of the reasons that that is being allowed to exist,
that that does exist, I think it's a it's a
reaction to something else that's going on simultaneously, which is
there are a lot of experts out there who are
(07:09):
performing really sloppy science, sometimes outright fraudulent science, and they're
frittering away whatever faith the general public or society has
in their expertise and in their profession. And there are
a ton of scientists out there. I would say the
(07:29):
vast majority, by far, of scientists are legitimate, upstanding, upright dedicants to science, right. That's where they, that's where they place their faith, that's where they hang their hat, that's where
their heart is, that's that's what they believe in, and
that's what they work to support. But science has like
kind of a problem, Chuck in that it's allowing way
(07:52):
too much for bias, which is what we're gonna talk about,
to creep into science, um, and undermine science and
basically produce papers that are just useless and trash. And
there's a whole lot of reasons for it, but it's
a it's something that needs to be addressed if we're
ever going to get back on a footing with a
(08:13):
faith in experts and expertise and just facts that there
are such things as objective facts. Yeah, I mean a
lot of times it's financially related, whether it's a lack
of funding, a desire for more funding, a desire just
to keep your your uh, your lab running and people
paid on staff. Which you know, all this stuff is understandable.
(08:35):
You want to keep doing this work, but you can't
let that get in the way. It's like it's like
in Rushmore at the end when Margaret Yang faked the
results of that science experiment because she didn't want it
to be wrong. You know, I don't remember what um,
I don't remember that part was that like a deleted scene? No? No, no,
(08:55):
was it in the end when they meet up and
he's flying the Uh. I think he's flying the kite
with Dirk and she's talking about her science fair project
and he was really impressed with it, and she was like,
I fake the results. And the reason why was because
because she didn't want to be wrong. Uh. And I
think a lot of times people will get into a
certain body of research or data too because they want
(09:19):
to prove a certain thing and if they can't, it
might be really hard to live with that. So that
weighs into it. Uh, money for personal gain, uh, advancing
your career, you know, publish or perish, that whole thing. Like,
we're gonna talk about all this, but there are a
lot of reasons that it's been allowed to creep in,
But all of it is at the disservice of their
(09:40):
the fundamentals of what they base their careers on to
begin with. Yeah, it's at the it's at the disservice
of science itself, right, because the whole point of science
and then the scientific publishing, the whole publishing industry, um
is to to basically create a hypothesis, test your hypothesis,
and then share the results with the world. And that's
ideally what would happen because you're building this body
(10:02):
of scientific knowledge. But money and corporate interests and academic
publishing have all kind of come in and taken control
of this whole thing, and as a result, a lot
of the stuff that gets published are trash papers that
shouldn't be published. A lot of the really good papers
that don't come up with sexy results don't get published.
(10:25):
And then, like you said, um, people using science for
personal gain. There is a very small cadre of thoroughly
evil people who are willing to use their scientific credentials
to create doubt in the general public, to to prevent
(10:46):
like people from understanding that climate change was real for
twenty years, or um, that fossil fuels actually do contribute
to, to anthropogenic climate change. But what we're mainly focusing on
is like bias in the in the sense that people
carrying out studies are human beings, and human beings are flawed.
(11:11):
We're just flawed, and we bring those flaws to our studies,
and that you really have to work hard at rooting
those flaws and those biases out to produce a really good,
thorough scientific study with good reliable results that can be
reproduced by anybody using the same methods. Um and that
(11:32):
science is just starting to wake up to the idea
that it is really biased and it needs to take
these into account in order to to progress forward from
the point that it's at right now, which is tenuous,
I think, perhaps more tenuous than ever the point that
science is at. I think. So science isn't going away.
It's not going anywhere. It's probably the greatest of course
(11:54):
humans have ever come up with. Right, It's not going anywhere,
But it is a terrible position that it's in, and
it's going to take some genuine leadership in the scientific
community from a bunch of different quarters in a bunch
of different fields to basically step up and be like, guys,
this is really bad and we need to change it now,
and a lot of people need to be called out.
(12:15):
And science typically shies away from naming names and calling
out by name fraudulent scientists because scientists seem to like
to um suppose the best in people, which is not
always the case, right, And having said all of this,
there could, we could root out every bias and, and really clean up the scientific publishing community. And there's
(12:40):
still a set, a certain set, of people in this country and in the world who that wouldn't matter to and who would still shut down facts, because it doesn't fit their narrative. For sure. But the people have
always they've always been there, right, and they're always going
to be there. There's always, it's just contrarians. They are, you can call them free thinkers, you can call them stubborn,
(13:01):
you can call them purposefully, purposefully ignorant, who knows. They're always
going to exist. The problem that this crisis that science
finds itself in right now is that it's allowed that
that population to grow and grow, and like people who
otherwise never really questioned science have been allowed to
kind of trickle into that fold. And that those are
(13:24):
the people that we should be worried about, the ones
who would would know better if they believed in science again, right,
And our way into this is to talk about different
kinds of biases, in true Stuff You Should Know fashion, a top ten that is not a top ten. That's
exactly right. We ate into at least three in this
intro and hopefully shining a light on some of this stuff.
(13:45):
People at least be more aware of different biases. And uh, well, yeah,
you know. The first one is is good old confirmation bias.
I mean, these aren't ranked because confirmation bias would probably
be number one as far as people's awareness of it.
But there are different examples, um that people use for
confirmation bias. And I kind of enjoyed the one from
(14:06):
the HowStuffWorks article, even though it was from 1903. After X-rays were discovered in Germany, there was a French scientist named René Blondlot. Yeah,
he looked at, uh, the X-rays and said, wow. Well, he said, hey, I see N rays, I've discovered N rays. And everyone's like, what's an N ray? He said, well,
(14:28):
it's like a corona when electricity discharges from a crystal
and you can only see it in your peripheral vision.
And American Robert Wood laid the wood and said, I'm
gonna come to your lab and check this out and
secretly removed the crystals during one of the experiments, and Blondlot still saw these N rays, and so that's
(14:52):
confirmation bias. He wanted to see those N rays. And
then later, even though it was disproved, other French scientists
supposedly published papers or published papers based on that research
because they wanted it to be true. So that's what
confirmation bias is, when you're starting out with a hypothesis that is going to shape the methodology of your
(15:14):
study to confirm it, right. And then it can
also occur where you're um. You you're interpreting info um
to fit your hypothesis, so you're seeking out stuff that
supports your hypothesis. And then the stuff that's just there in front of you, the results, they are there in front of you, like, ah, this thing proves that those N rays actually exist, um, or this
(15:37):
phenomenon cannot be due to anything but N rays, therefore N rays exist. All of it is confirmation bias. And that,
like you said, that's number one, because that's not just
a scientific bias. I mean, like that is that's like
every human uses confirmation bias, and that it's two fold.
We, we avoid contradictory information because we, um, don't
(15:59):
like to be wrong, and we find information that confirms
our point of view because we like to be right.
That's it's confirmation bias, and it's everywhere among everyone. That's right,
Although I will say I know it happens a lot politically,
but myself and the people that I congregate with, uh
(16:19):
question their own leaders as much as they do leaders
from the other parties. Oh, it's good, it's very good
to do. And I don't know, there shouldn't be sacred
cows in politics. That's a bad jam, you know. And
it's like I've always been, like, at the forefront of calling out my own party's wrongs and saying no, no, no,
(16:40):
that's you need to do better than that, whereas I
see a lot of other people in other situations truly
bury and ignore those things because they, you know, just don't want to face that. Yeah. And it's
not like it's not even like I don't want to
face it. It just doesn't fit their worldview, so they
just don't include it. It just gets tossed out. But
(17:02):
the point is is like it's not an active process necessarily, right,
I think we should probably take our first break. I think so too. All right, we'll be right back
and talk about sampling bias right after this alright, Chuck,
(17:38):
we're back, and we're coming back with UM, something called
sampling bias, which is it turns out a sub type
of a larger thing called selection bias, and one other
thing we should say. We kind of got into it
before I could say this. There are different stages in
a study where bias can occur. It can happen in
like the planning the pre study phase, UM and and uh,
(18:02):
it can happen during the actual study, and then it
can happen after the study as well. And so when
we're talking about any kind of selection bias, including sampling bias,
this is pre study bias where when you're actually setting
up the study, this bias is where this is where
the bias is going. Yeah, and you know what, I
think it also bears saying that biases you have to
(18:23):
work really hard to avoid it because it's it's almost
like a disease that's always trying to get involved. And
it's not like just do better, everybody and quit being biased.
It's like it's way more complicated than that, because it
is always knocking at the door, like you said, in
all three phases, trying to sneak in there, and it
takes a lot of work in all three phases to
(18:44):
avoid it. So it's not as I don't want it
to come across this as easy as us just saying
like you shouldn't do that, stop it. No, But the
first step is to recognizing that there's a lot of
bias and different kinds of bias that are just sitting
there waiting for a scientist. And then if you start
admitting that it's there, you can start being on the
(19:05):
lookout for it, and you can start adjusting for it,
and then other people who read your papers or hear, you know, read news articles about your papers, can be
on the lookout for that kind of thing. Yeah, so exactly, Uh,
sampling bias is your, you know, your sample set not being an accurate and good representation of the whole. A
lot of times you'll find this um in either studies
(19:28):
that are really small scale because you don't have a
large sample and you don't have the kind of money, so you use who's near you. Like maybe you work for a university, so you work with university students as your first sample set, who are not indicative of anything but, you know, people eighteen to twenty-one years old or so. Now, remember
we talked about WEIRD, Western, Educated, Industrialized, Rich, and Democratic.
(19:52):
That's exactly the thing. It's like, I mean, it's a
it's a decent place to start if you don't have
much money and you want to get the ball rolling.
It's not like, oh, you shouldn't do university studies at
all and using students, but those findings definitely don't represent
the wider nation, and it needs to grow and get
more funding if you want to actually have a legitimate
(20:13):
claim to something. Another way that sampling bias can come
up is from like the group that you're recruiting from.
Like if you're doing a strictly online survey, but you're
trying to apply your findings to the wider society, that's
just not gonna happen because there's so many people who
aren't Internet savvy enough to take an Internet survey. Like,
(20:35):
by by nature, you are a little savvier than the
average person if you're hanging out on the Internet and
taking a survey. I like to tell myself that, at least. Um, and then kind of tangential to that is something called self-selection bias, which
is where the people who say, let's say you're doing
a study on wellness and you know what eating tuna
(20:58):
can do for your health. Um, people who are interested
in wellness and health are going to be much more
likely to volunteer for that study than people who couldn't
care less about health and have no desire whatsoever to
further science's understanding of what makes you healthy. So you
would have to go out and find those people and
recruit them rather than just relying on the people who
(21:21):
volunteered based on the flyer you put up in the student union, right. Or, you know, study all financial demographics, or poll all financial demographics, rather than just, and you know, sometimes it's the methodology in which they try and recruit people that steers them in that direction unknowingly, that I know.
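A minimal Python sketch of the self-selection problem being described here, with invented numbers: health-conscious people are more likely to volunteer for a tuna-and-wellness study, so the volunteers' average overstates the true population average. The population model and probabilities are made up purely to illustrate the bias, not drawn from any real study.

```python
import random
import statistics

random.seed(42)

# Hypothetical population: 10,000 people, each with a "health interest" score
# and a weekly tuna-servings count loosely tied to that interest.
population = []
for _ in range(10_000):
    health_interest = random.random()                       # 0 = indifferent, 1 = very health-conscious
    tuna = max(0.0, random.gauss(1.0 + 2.0 * health_interest, 0.5))
    population.append((health_interest, tuna))

# True population average, which a representative sample would estimate well.
true_mean = statistics.mean(t for _, t in population)

# Self-selected sample: the more health-conscious you are, the likelier you volunteer.
volunteers = [t for h, t in population if random.random() < h]
self_selected_mean = statistics.mean(volunteers)

# Random sample of the same size, drawn from everyone, for comparison.
random_sample = random.sample([t for _, t in population], len(volunteers))
random_mean = statistics.mean(random_sample)

print(f"true population mean:     {true_mean:.2f} servings/week")
print(f"self-selected volunteers: {self_selected_mean:.2f} servings/week (inflated)")
print(f"random sample, same size: {random_mean:.2f} servings/week (close to the truth)")
```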
In the article they talked about the presidential campaign with
(21:45):
Roosevelt and Alf Landon, Republican Alf Landon. They were doing polling with, like, country club rosters and, uh, people who
drove cars and stuff. At the time that was kind
of a luxury, so it was all out of whack.
Everyone's like, Landon's gonna win in a landslide. It's because
you just kind of basically stuck your polling to uh,
(22:06):
you know, I don't know about wealthy individuals, but people
who are a little more well off. And I think, um,
we talked about that in our polling episode, that is
that fiasco with polling. I also saw one more, too, that I want to mention because that has a really great anecdote attached to it. It's called survivorship bias, where, when
you're studying something, say like business or something, you're you're
(22:28):
probably going to just be looking at the extant businesses,
the businesses that have survived twenty years, thirty years, fifty
years or something like that, and you're not taking into
account all of the failures. So when you put together
like a prognosis for business in America, it might have
a sunnier outlook than it should, because all you're
looking at are the ones that managed to survive and thrive.
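A small Python sketch of that idea, with invented return numbers: if the weakest businesses tend to fail and drop out of the data, the average computed from survivors alone looks rosier than the average over everything that was actually founded. The failure rule and the returns are made up just to show the effect.

```python
import random
import statistics

random.seed(7)

# Invented model: 1,000 businesses are founded, each with a random annual return;
# businesses with poor returns usually fail and disappear from the observable data.
businesses = [random.gauss(0.02, 0.10) for _ in range(1_000)]          # e.g. 0.02 = 2% return
survivors = [r for r in businesses if r > -0.05 or random.random() < 0.2]

# Studying only the survivors overstates how well business does overall,
# because the failures silently drop out of the sample.
print(f"all founded businesses: {statistics.mean(businesses):+.1%} average return")
print(f"survivors only:         {statistics.mean(survivors):+.1%} average return (rosier than reality)")
```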
And that's survivorship bias. And did you see that, the
(22:50):
anecdote about the World War Two fighter pilots. It was
actually pretty funny because they studied, uh, planes that had returned, that had been fired upon and managed to get
back safely, and they're like, well, let's look at all
these different bullet holes and where this plane was hit,
and let's beef up all those areas on the body,
and a mathematician named Abraham Wald said, uh, no, those
(23:14):
are the places where they got shot and did okay. What you should really do is find the planes that actually went down and beef up those sections of the plane.
And that's survivorship bias. It's just, it's failing to take into account the failures that have to do with
what you're trying to study. What about channeling bias. Channeling
bias is another kind of selection bias. Did you get
(23:37):
this one? It wasn't the best example of channeling bias. Yeah,
I mean that's I got it, all right? Did you
not get it? I got it, but it took a
lot of work before I finally did. Well. It's basically
when let's say you have a patient and they're like,
their degree of illness might influence what group they're put into.
(23:58):
So if a doctor or if a surgeon was trying
to study outcomes of a particular surgery, they might, because they're
surgeons and they want to help people out, they may
perform that surgery on maybe younger, healthier people who might
have better outcomes than someone who is in a different,
like, higher age group. Right. And the, the article
(24:21):
kind of ends it there, and I was like, so what's the problem? And I finally found this example where,
where it says like, Okay, let's say you're studying a
new heartburn medicine or something, or something to treat, like, like GERD, and it's new, it's hardcore, it's cutting edge, um.
(24:42):
And the people who are likeliest to get this new hardcore, um, antacid are the ones who are probably in worse shape, right? So say they're on the verge of going to the ER anyway. Well, if
you look back at all of the people who have
ever been prescribed this new hardcore antacid, you're gonna
see like a lot of them ended up in the
(25:02):
ER, even though it had nothing to do with this hardcore antacid. And then similarly, the people who have so-so GERD, it's not particularly bad, they'll probably
be prescribed the old drug, the standby that everybody knows
it's fine, that's going to work. So if you compare
the old drug and the new drug, it looks like
the old drug is super safe, but the new drug
will put you into the ER. Whereas, um, that's channeling.
(25:25):
You've channeled different different people with different prognoses into different groups,
and they're kind of pitted against each other, um in
a in an effort to obscure the truth. If you
wanted to really know the genuine health outcomes for that
an acid, you would have to give it to people
with not so bad gurd and people with really bad
gurd and see what happens. See if the e ER
(25:46):
visits continue for people who wouldn't otherwise be going to
the e R Everyone, right, and not just for you,
Like if you're debating surgery and you're like, oh, well
it shows really good outcomes, we're like, well yeah, but who are they operating on, right? Right? Yes.
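A rough Python sketch of the channeling problem in that antacid example, using made-up probabilities: sicker patients get channeled onto the new drug, so a naive comparison blames the drug for ER visits that severity alone explains, while comparing within severity groups shows the drugs behaving the same. None of these numbers come from a real trial.

```python
import random

random.seed(1)

# Invented probabilities: severe GERD patients are usually given the new drug,
# and severity, not the drug, drives the risk of an ER visit.
patients = []
for _ in range(20_000):
    severe = random.random() < 0.3
    drug = "new" if (severe and random.random() < 0.8) or (not severe and random.random() < 0.1) else "old"
    er_visit = random.random() < (0.25 if severe else 0.03)
    patients.append((severe, drug, er_visit))

def er_rate(rows):
    """Fraction of patients in this group who ended up in the ER."""
    return sum(er for *_, er in rows) / len(rows)

# Naive comparison: ignores who was channeled onto which drug.
naive_new = er_rate([p for p in patients if p[1] == "new"])
naive_old = er_rate([p for p in patients if p[1] == "old"])
print(f"naive comparison: new drug {naive_new:.1%} ER rate vs old drug {naive_old:.1%}")

# Comparing within severity groups (or randomizing the assignment) removes the channeling.
for severity, label in ((True, "severe"), (False, "mild")):
    new = er_rate([p for p in patients if p[0] == severity and p[1] == "new"])
    old = er_rate([p for p in patients if p[0] == severity and p[1] == "old"])
    print(f"{label:>6} patients: new {new:.1%} vs old {old:.1%}")
```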
(26:06):
I would like to invite anyone UM who got what
I was saying or got channeling because of what I
was saying, I invite you to email and let me know.
I'm doing like a little bit of surveys here, and
I'd like to know if I confuse things more or
make it more understandable. Well, but I know it's funny
either way. I got that part. I'm just trying to
(26:27):
figure out if it's understandable. But here, with your methodology, you're talking about the Stuff You Should Know listener, who by nature is smarter than your average bear. Well, I'm not going to publish it. I'm gonna file-drawer it either way.
Oh what a teaser. UH question order bias is the
next one that is uh. And this is mainly obviously
(26:49):
with UM when you're just doing uh, like polling and
stuff or like an online survey or you know, it
could be just asking people a set of questions like
in a social science research setting, and the way you
order things can affect the outcome. And this has to do with, like, everything from the brain's tendency to
(27:10):
organize information into patterns to the brain uh, simply paying
attention more and being more interested early on. Like I
know there was one The General Social Survey was a
big long-term study of American attitudes, and in it they were asked to identify the three most important qualities for a child to have, and they were given a list
(27:31):
of these qualities. When honesty was listed higher on the list, it was picked sixty-six percent of the time. When it was further down on the list, it was forty-eight percent. And that's simply because people
are just reading this list and they're like, important, and then, yeah, by the time they got down, you know,
three quarters of the way through the list, they started
(27:52):
thinking about what they're gonna have for dinner. People get
pooped when you're giving them lists of stuff. Or you can prime people and get them all sort of worked up.
Like if you have a question like, uh, during the
Trump administration, how mad were you at that guy about
stuff he did? And you're like super mad? And then
(28:13):
you were like, well, how did you feel generally about
how your life was affected during his administration? You might
say it was awful, Whereas if they hadn't have asked
that first question, they were just like, what was your
life like from two thousand and I am blocking out
the dates. What you might say, well, you know, it
wasn't It was okay, man, I had a lot of
(28:38):
sandwiches just over those four years though. Um. Yeah, so,
and like you said, that's priming, which is a big
it's a big thing that you have to worry about
when you're doing any kind of survey. Um. So here are some of the ways that you can combat that. You
can randomize your question order. Um. Sometimes you'll have a
survey where one question is predicated on a previous question.
(29:00):
So one thing you might want to do is UM
ask that set of questions in a few different ways. UM.
So that yeah, so that you can kind of maybe, um,
compare the answers to all three, add them up and
divide them by three and there's your average answer kind
of thing. Um. There's a lot of things you can
do to kind of, I guess, um, manipulate, to de-
(29:24):
manipulate your respondent when you're doing a survey like that. Demanipulate, demanipulate.
Look it up. You won't find anything on it, but
you could look it up still. Oh, it's a Roxy Music album. Wow, Chuck. Wow, that was great. What's next?
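A minimal Python sketch of the two fixes just described, randomizing question order per respondent and averaging a respondent's answers across reworded versions of the same question. The survey questions, the 1-to-5 scale, and the simulated answers are all invented for illustration.

```python
import random
import statistics

# Three reworded versions of the same underlying question (invented wording),
# each answered on a 1-5 agreement scale.
SATISFACTION_VARIANTS = [
    "Overall, how satisfied are you with your commute? (1-5)",
    "How would you rate your typical trip to work? (1-5)",
    "How happy are you with how you get to work each day? (1-5)",
]
OTHER_QUESTIONS = [
    "How many days per week do you commute?",
    "How long is your typical one-way commute, in minutes?",
]

def build_survey():
    """Return one respondent's question order, shuffled to fight order effects."""
    questions = SATISFACTION_VARIANTS + OTHER_QUESTIONS
    random.shuffle(questions)            # each respondent sees a different order
    return questions

def satisfaction_score(answers):
    """Average the three reworded satisfaction items into a single 1-5 score."""
    return statistics.mean(answers[q] for q in SATISFACTION_VARIANTS)

# One simulated respondent's answers, keyed by question text (random placeholders).
respondent = {q: random.randint(1, 5) for q in SATISFACTION_VARIANTS}
respondent.update({q: random.randint(1, 60) for q in OTHER_QUESTIONS})

print(build_survey())                    # a freshly shuffled order for this respondent
print(satisfaction_score(respondent))    # e.g. 3.33 on the 1-5 scale
```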
(29:47):
So with question order bias, we've entered the during the
study kind of bias, like, this is, this is while you're actually conducting the study, and so is interviewer bias. Um,
And interviewer bias, it's kind of like, well, question order bias has to do more with the study design, but it's a bias that emerges during the study. Interviewer
(30:08):
bias just straight up is in the middle of the study,
and it has to do with the person actually asking
the questions in an interview. UM. It can also I
think apply to somebody conducting, uh, like the, like a, a clinical trial on a drug. If they know whether
somebody is getting placebo or not, it might affect their behavior.
(30:31):
But ultimately what it is, is the person who's wearing
the white lab coat is influencing the outcome of the
of the study just through their behavior, through their tone
of voice, through the way that they're asking a question.
Sometimes it can be really overt and like let's say, um,
like a super devout Christian is is, you know, doing
(30:53):
a study on whether, how many, what part of the population believes Jesus saves. And they might be like, you know, the question is, don't, don't you think Jesus saves? Don't you? Like, it seems like it, huh? That kind of
thing would be a pretty extreme example, but it's it's
sometimes how you understand things is in the absurdities, you know. Yeah,
(31:14):
I thought this example in the HowStuffWorks article
is kind of funny. It was about just like a
medical questionnaire where the interviewer knows that the subject has
a disease that they're talking about, and they may probe
more intensely for the known risk factors. And they gave
smoking as an example, and it said so they may
(31:35):
say something like, are you sure you've never smoked, never,
not even one. Like, if I heard that coming from
a researcher, even without knowing a lot about this, I would say, what, what kind of a researcher are you? Like, it seems like you're looking for an answer. You should, you should say, you are ethically compromised. Or even facial expressions or
(31:57):
you know, body language. All that stuff weighs in, I don't know, a raised eyebrow. Like, why don't they just have the robots, Alexa or Google or Siri or somebody, ask them? Well,
that's one good thing about something like an Internet survey
is like it's it's just questions, and as long as
you design the questions and you randomize their presentation like it,
it's gonna be fairly helpful in that respect. But then
(32:18):
it's kind of its own pitfalls and pratfalls. You can
attract a lot of people who are just taking it
to mess with you, and there's a lot of problems
with with all of it. But again, if you're aware
of all the problems, you can plan for them. And then even if you can't plan for them or control them,
you can write about it in the actual study and
be like this this study I remember running across studies
before where they're basically like, there are you know, there
(32:41):
was a kind of bias that we couldn't we couldn't
control for, so we can't really we can't really say
whether it affected this the outcome or not. And I thought, wow,
this is really refreshing and like even daring kind of
like I was thrilled. Um, But you don't see that
very often. But that, from what I understand, is the direction that science is going towards now. Well, and
(33:03):
the reason you don't see that. And then something we'll
talk about is, is what actually ends up getting published. It may be less likely to get published if they're like, hey, you know what, you know what I'm saying? Yeah,
I know. So let's do recall and acquiescence bias because
they're very much related, and then we'll take a break.
That's our plan. What do you think of it? Everyone
(33:27):
says it sounds good to me, all right. So this
is also during study. And this is, so, in very much the way that an interviewer can influence the outcome,
the participant can actually influence the outcome too, especially if
they're being asked questions or they're they're being asked to
self report. Um, there's a couple of ways that us
(33:47):
just being humans can foul up the works on on
the findings of a study. The first is recall bias. Yeah.
This is when you're obviously you're trying to recall something
from the past. And it's amazing what might jump out
at you from your past when probed with a certain question,
certain correlations that really have nothing to do with it
(34:09):
with you. But you may be like, oh, well, you
know what, now that I think back, I remember around
that time I was I was watching a lot of
Dancing with the Stars. I kind of binge that show.
So maybe that's why I had homicidal tendencies. I don't
think you need to study to prove that. I think
that's just intuition, you know. Yeah. Um but yeah, so,
(34:32):
and if enough people do that, especially if there's something
out kind of in like the zeitgeist about that, how
like people who watch too much Dancing with the Stars
want to kill other people? Um, like, a number of
your participants might recall that thing, whereas other people who
don't watch Dancing with the Stars aren't going to recall that.
(34:53):
And so in the same way that, um, survivorship bias works, those people who don't have that memory to recall, that memory can't possibly be included in the study results,
which means that Dancing with the Stars is going to
kind of percolate to the top as like a major
risk factor in homicidal tendencies. Right, that's not good. You
(35:16):
don't want you don't want Dancing with the Stars unfairly canceled.
You want it to be canceled because it is terrible. I've
never seen it. I'm sure it's great if you're not dancing.
I haven't either. But watch we're gonna be asked to
be on Oh my God, and they'd have to change
the name of the show to Dancing with the mid
level Internet famous right exactly. Wow, dust off my jazz shoes.
(35:43):
It would be us and Chocolate Rain. And you know I love that guy. Tay Zonday is his name. Yeah,
we actually met him that time. I remember it was great. Man. Um.
Another thing, Chuck, that, that has to do with recall bias is that, um, like, we just tend to have faultier memories with stuff that makes us look bad,
(36:04):
like say, unhealthy habits. So if if you're doing a
study on, um, junk food and, and health outcomes, and you interview a bunch of people who are in terrible health, and all of them are like, I only eat, like, Cheez-Its, like, once in a blue moon, or something like that, to the researcher, right. So, once-in-a-blue-moon Cheez-Its.
(36:26):
Um, like, the results of the study are gonna suggest that it takes just a very small amount of Cheez-Its to put you in the hospital with long term
chronic health conditions. And that's that is a problem with
recall bias, Like it is. It's it's the participants affecting
(36:46):
it in this case, because they just aren't paying attention to, aren't really thinking about, no, you, you've eaten a lot of Cheez-Its, and it takes a lot of Cheez-Its to put you in the hospital, not a very
little amount. It's not the best example, but it kind
of gets the point across, I think. Now, is this part of acquiescence bias? No, that was, that was the end of recall bias. Acquiescence bias is different, but it's
(37:08):
certainly related. Both of them kind of fall under an
umbrella of participant bias. Yeah, and acquiescence bias. I feel
like there's the opposite too. I just don't know if it has a name, um, because acquiescence bias is generally like
people people want to be agreeable, and they want to
answer in the affirmative, and they want to especially they
(37:29):
found um, if you are maybe less educated, you might
be more willing to just go along with something and
say yeah, sure, yeah, yeah, um to maybe appear smarter
or just to be more agreeable. I do think it's
the opposite can happen too, especially with political um research
in social studies and that I think there are also
(37:51):
people that are like, oh you're from the what well, yeah, sure,
I'd love to be interviewed, and then they go into
it with a sort of opposite mentality, where they're completely
disagreeable no matter what anyone says or asked. Yeah, I
didn't run across that, but I'm absolutely sure that that
is a bias out there. But you can avoid these
by doing it more smartly, right, more smartly. Yeah, there's
(38:15):
ways that you can, Um, you can frame your questions,
like, like, people don't like to, to admit that they didn't actually vote in American democracy. So, instead of saying, there was a, there was a Pew suggestion, Pew, um, where they said, um, rather than saying, like,
(38:35):
did you vote in the last election. A lot of
people who didn't vote are gonna be like, sure, yeah,
of course, why why would you ask that um? Instead
you would phrase it as like, um, in the two
thousand and twelve presidential election, Uh, did things come up
that prevented you from voting? Or were you able to vote?
And you would probably actually want to train your researcher
(38:57):
to use that same intonation to make it seem casual
either way, Like you want to give the person a
sense of comfort that they're not being judged no matter
how they answer. A good way. That's a good way
to get around acquiescence bias. Absolutely, yeah, the old
backdoor policy. That's right, where you can squeeze a woman.
(39:23):
All right, are we taking a break? I think we're
We're mandated by the FCC to do that. After that joke,
all right, well, we'll be back and finish up with
our final two biases right after this. I also I
(39:56):
want to apologize to all the parents who listen with
their six-year-olds these days. My daughter is six.
She doesn't care about what we do. That's great. So
it's still just flying overhead, right? I mean, she doesn't even listen. She likes Movie Crush a little bit. Well, some kids that are six listen, and hey, shout out
to all of you guys. Listen. No, I'm always whenever
I see that, whenever someone writes in and says their kid
(40:18):
my daughter's age actually listens, I'm like, really, oh, yes,
this is my daughter. She loves it, right, Yeah, and
she voted in the last election too. Like, uh, my
daughter likes to watch videos of kids playing with toys
on YouTube? Kids? Is she into that? Now? Those are
the worst videos. I'm starting to get on her now
with just in terms of taste. I'm like, hey, you
(40:40):
can watch something, but like watch something with a story
that's good. It's like, this is garbage. I like it.
I can totally see her saying it just like that. Three! Defiant and happy. Good impression. Alright. Publication bias is one we
we kind of poked around it earlier a little bit
with the whole publish-or-perish mentality. Can I add something
(41:01):
more to that real quick before we get into publication bias,
you, you know, if you don't mind, to, to just talking about publication in general. So I don't think that
it's fully grasped by most people. It certainly wasn't by
me until really diving into this, that the academic publishing
industry has a stranglehold on science right now, in a
(41:25):
very similar effect that twenty four hour cable news had
on like journalism, to where it was like it became
this voracious beast that was willing to just spit out money, constantly feed it, in exchange for, yeah, feed it, give me more stories, give me more, give me more pundits, give me, like, that was the rise of pundits. Pundits
(41:47):
didn't really exist prior to that. They just hung out
on the editorial pages of newspapers. And then twenty four
hour news came along, and there's not possibly enough news stories,
like good news stories, to keep going for twenty four hours,
so you have to talk about the news stories and
analyze them, and then you start getting into who's wrong
and all that stuff. The publishing industry is very much
like that now, where it's this beast that must be fed,
(42:09):
and so there's there can't possibly be that many high
quality scientific papers. So scientific papers have just kind of
dipped down in quality, and then um, one of the
other things that the publishing industry has done is said,
we really like like studies that have results. They're called
(42:30):
positive results, where like it turned up that that you
found uh correlation between something, or the compound you tried
on that tumor shrunk the tumor. Like those are what
we're interested in, the whole furthering of science with positive
and negative outcomes. Just to say this did work, this
doesn't work, don't bother trying it. We don't care about
(42:52):
that kind of stuff. And that's a huge issue for
the scientific community, Like they have to get control of
the publishing community again, um if if they're going to
come out from under this dark cloud. Yeah, I mean
they found, in a two thousand and ten study, that papers in social sciences especially were about two and a half, I'm sorry, two point three times more likely
(43:14):
to show positive results than papers in physical sciences. Even so, some, some bodies of research are even more apt, uh, to publish positive results. And that means if you're
going you know this going into your profession, and you
know this going into your set of research, and it's
(43:35):
you know, that's when it becomes sort of put up
or shut up time. As far as standing firm on
doing good work even if it doesn't get published, right,
and so that that confirmation bias can really come in
where you start, hopefully inadvertently but certainly not in all cases, inadvertently,
(43:56):
start cherry picking data to get a positive outcome where
there really wasn't one there before, or you use a
kind of a weird statistical method to to suss out
the correlation between the variables so that you can have
a positive outcome. Because if you're not publishing papers, like
your academic career is not progressing and you can actually
(44:19):
like lose jobs, so you need to be published. The
publishing industry wants your paper, but they just want positive outcomes.
So a high quality, well designed, well executed study that
found a negative outcome to where they said, well, this
compound we tried didn't actually shrink the tumor, that's that's
going to be ignored in favor of a low quality
(44:41):
paper that found some compound that shrunk a tumor just
because they like positive outcomes. It's ridiculous. Yeah, And I
mean that kind of goes hand in hand with the
last one. You know, there's a lot of overlap with
these and a lot that work sort of in concert
with one another. And file drawer bias is, you know,
it is what it sounds like. It's like you you
got a negative outcome, and whether or not you were
(45:03):
being funded by a company that definitely doesn't want that
information getting out there, or if it's just as a
result of it being less likely to be published because
it doesn't have a positive outcome, you just stick it
in the file drawer and it goes bye-bye, right,
and again, like part of the point of science and
scientific publishing is to generate this body of knowledge. So
(45:24):
if you're about to do a study, you can search
and say, oh, somebody already tried the same exact thing
and they found that it doesn't work. I'm gonna not
try to reproduce that. I'm just gonna not go with it,
um and move on to try something else. It's a
huge waste of resources otherwise. And then also you could
you can if you aren't publishing that kind of stuff, Um,
(45:45):
you're missing out on well, I mean you're missing out
on the real data. If the bad data is file-drawered, like, you're missing out on the truth. You're missing
out on the whole picture, right, And also Again, it's
not just that the poor negative outcomes need to be included too. Yes, that's true, but you're also promoting
(46:08):
positive outcome studies that actually aren't good studies. There's this
thing called the Proteus effect, where the initial studies, these
initial papers on a subject um in sevent cases, a
follow up study that seeks to reproduce them can't reproduce them.
They don't come to the same finding, the same conclusions,
(46:28):
which suggests that a study was really terrible. Um, if
it can't be reproduced, or if it's reproduced, somebody comes
to a different finding, different conclusion, that's not a good study.
So the idea of publishing positive and negative outcomes together
would definitely kind of slow that whole crazy twenty four
hour news cycle. Positive outcome study. I don't see how
(46:51):
it's even legal to bury not bury, but I guess
just not even just a file drawer, a study suddy
that included like a drug having negative effects like and
I know that Congress is stepped up to try and
pass laws too. I think there was one UH in
(47:11):
two thousand seven requiring researchers to report results of human
studies of experimental treatments. Uh, and then they tried to
strengthen that in sixteen. Basically this like, you know, even
if your drug doesn't come to market, like, we need
to have these studies and the results. Like, how, how's
it even legal? It seems like you're burying, and it's
(47:33):
almost falsification. Well, it is, for sure, because you're also
like if you're if you're talking about studies where you
have multiple studies on say one drug that's an antidepressant,
and all you're doing is publishing the ones that have
positive outcomes for that antidepressant, and you're just not publishing
the ones that that showed no outcomes or maybe even harm,
(47:55):
then yeah, that should be illegal, especially when you're talking
about something like an antidepressant or in the biomedical field.
But it's certainly unethical for any field of science in particular.
Just bury the stuff you don't like that doesn't support
your conclusion. It's a kind of a meta form of
um confirmation bias, just putting aside the stuff that doesn't
fit your hypothesis or your worldview, and um just promoting
(48:18):
the stuff that does. I saw one one way around
this is the Lancet, you know, the very respected medical journal.
I think it's British. The Lancet um has taken to
um accepting papers based on the study design and methodology
and goals. So when you first plan your study and
(48:38):
you have it all together before you ever start, that's
when you would apply to have your paper studied and
published in the Lancet, and that's when they decide whether
it's a high quality enough study to publish or not.
So then they're locked into publishing your study, whether your
outcome is negative or positive, and has the knock on
effect of the Lancet basically being like this is this
(49:00):
is a trash study. We would never publish this, don't
even bother. So it's saving funds and then the high
quality studies are the ones that are going to get published.
And then also the positive outcomes and the negative outcomes
get, get published regardless, because they have no idea what the outcome is going to be, because they accept the paper
before the paper, before the study has even been conducted.
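A tiny Python sketch of why that matters, with made-up numbers: simulate a pile of studies of a treatment whose true effect is zero, "publish" only the estimates that come out looking clearly positive, and the published average suddenly looks like a real effect. The cutoff here is just a stand-in for a significance filter; none of it reflects real data.

```python
import random
import statistics

random.seed(3)

# Invented setup: 500 small studies of a treatment whose true effect is zero.
# Each study reports an estimated effect with sampling noise.
TRUE_EFFECT = 0.0
studies = [random.gauss(TRUE_EFFECT, 0.5) for _ in range(500)]

# In this toy model, only estimates that look clearly "positive" get published;
# everything else stays in the file drawer.
published = [e for e in studies if e > 0.8]
file_drawer = [e for e in studies if e <= 0.8]

print(f"true effect:                {TRUE_EFFECT:.2f}")
print(f"average of ALL studies:     {statistics.mean(studies):+.2f}  (close to the truth)")
print(f"average of PUBLISHED only:  {statistics.mean(published):+.2f}  (looks like a real effect)")
print(f"studies left in the drawer: {len(file_drawer)} of {len(studies)}")
```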
(49:20):
I saw another thing that said that, uh, a paper would be more likely to get published in the Lancet if
it had cool illustrations. That's right, That never hurts. Everybody
knows that, that's not unethical, especially in color. Just put in a few of those New Yorker cartoons, forget about it.
Everybody loves those. You got anything else? I've got
nothing else. You know, this is a little soapboxy, but
(49:42):
this is something that we believe in. It's it's kind
of like our episode on the Scientific Method a little bit.
Mm hmm. I like it too. Yeah, thanks for doing
it with me, man, Thank you for doing it with me.
Thank you for squeezing my lemons. Sure. Uh, if you
want to know more about scientific bias, there's a lot, fortunately,
(50:03):
a lot of sites and great articles dedicated um to
rooting that stuff out and to making you a smarter
consumer of science. And so go check that out and
learn more about it. And since I said learn more
about it, it means it's time for a listener mail.
You know. Sometimes the listener mail dovetails quite nicely with
(50:25):
the topic, and that was the case today with our inclusion.
Oh yeah, on the Media Bias List, which was pretty exciting. Yeah,
what an honor. You know, there's something called the Media Bias, is it called the Media Bias List? I believe so.
And it's you know, what it does is it takes
news outlets and newspapers and you know, TV and stuff
(50:47):
like that, and they just sort of it's a big
chart where they're ranked according to like how biased they are,
you know, kind of up down, left and right. And
they included podcasts this year, they did, and we were on it, yes, and it was really kind of cool.
We had a bunch of people write in. And this
is from Nicholas Bett. Oh, he said, I found this
post while I was scrolling through Facebook, uh and waiting
(51:08):
for the NFL season to start. Ad Fontes Media? Is it Fontes or Fontess? We should know. I'm not sure, one of the two. It's a, it's a watchdog organization known for the Media Bias Chart. Um. They do a
media bias chart where they rank every news outlet's political bias,
and in the recent update they included you guys, and
wouldn't you know it the most politically fair piece of
(51:30):
media you can possibly consume in all of the known universe is Stuff You Should Know. You guys say you're liberal, but until I heard Chuck outright state it, I didn't even know. Wow. That, well, I think, I think it slips
through there sometimes. Well, yeah, we're certainly human beings and we have our own biases, but we definitely try to
(51:51):
keep them in check. Yes, we try to. And I
think it's just been confirmed, because they don't just, like, listen to a couple of shows and go, oh, these guys seem okay. Like, they really listen, then they really rank people. Um. They probably saw that too,
or perhaps they listen to the North Korea episode where
Josh suggested Wolf Blitzer apply hot paper clips to his inner thighs while writing a nice piece on Trump's Korean relations.
(52:13):
Hilarious either way, Thank you guys for your fairness and
hilarity over all these years. You're both the best. That is from Nicholas Bett. Oh, thanks a lot, Nicholas.
Thanks to everybody who wrote in to say that they
saw that. We appreciate it. Uh. And it was neat
to see ourselves right in the middle of the rainbow.
Love being in the middle of that rainbow. I do too, Chuck. It's nice and warm and cozy in there,
isn't it. Yes, Well, if you want to get in
(52:35):
touch with us, like Nicholas and the gang did, you
can send us an email to Stuff Podcast at iHeart
radio dot com. Stuff You Should Know is a production of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen
to your favorite shows.