Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Kyrin Down (00:00):
There will always be someone selling BS, but you can learn how to spot it.
Welcome, Mere Mortalites, to another round of the Mere Mortals book reviews. I'm your host, Kyrin, live on the 3rd of September, 2025.
As you might surmise,
(00:20):
as indicated by the name, this is the podcast where I expose the lies of the big science industry, especially big banana.
Damn you, big banana. No. Okay. No. We're not gonna be doing that today, but we will see some of that in this book, which we have here, Bad Science by Ben Goldacre.
So this was published originally in 2008.
It's,
(00:41):
339
pages in length,
relatively
smallish. It's got a couple little graphs and things in here, so
won't take you too long to get through. I'd say it took me, like, six to seven hours reading in total, something like that. And it's an expose of sorts
on the media and in particular
(01:01):
individuals,
but also the media as a whole,
relating to distorting of statistics
statistics and science.
So,
in particular, scientific data. So he'll explain the various errors that they commit in their analysis of studies, of trials, and things like this, and how they're reported to the general public.
(01:25):
And then he also explains the,
I suppose, the underlying reasons why they might be committing this, which can involve some level of stupidity
slash,
ignorance slash,
malevolence and various
proportions of that according to the topic and the industry and things like that. So there are 16 chapters in total,
(01:50):
and they are labeled things like Matter, Brain Gym, The Progenium XY Complex, which were all kinda, like, unhelpful. I had no idea what they meant at the start. But then it gets on to things like homeopathy,
the placebo effect.
He examines the media's MMR
hoax
as well as things like nutrition
and, in particular, certain doctors,
(02:12):
Gillian McKeith, PhD, and Professor Patrick Holford,
who are listed in the actual chapters themselves. So there's a bit of calling out of actual individuals, but then also of organizations
and of,
particular topics. You will find some graphs and things like that within the book,
(02:33):
but there are not that many. It's mostly about the writing and quoting of people and things like this. So
what is
bad science, and what is the media's role in promoting it?
I guess you'd call this the educational section of the podcast, so, settle in.
And probably worth most of your time if you are an average layperson
(02:55):
in a sense,
you know, both of those questions
are interesting. So I'll tackle the bad science portion, then we'll go on to the media
portion of this book. So
on your screen right now, there's a list.
This is from an external source outside of the book, but it gives a very good, like, brief indication of some of the things you need to look out for,
(03:16):
when the scientific method is distorted,
misinterpreted
and analyzed incorrectly,
or when the actual method itself is not followed. So,
some of these are,
well, many of these are found within the book, and I'll link them to particular things he talks about. But just going over here, sensationalized headlines, misinterpreted results, conflicts of interest, correlation and causation,
(03:41):
speculative language, sample size too small,
no blind testing, no control groups, cherry picked results, unreplicable
results, all these sorts of things. So, what are some of the things we'll see within the book? Well,
one of the big sections right at the end, which was on the MMR vaccine
autism link. So this was
around the February 1998
(04:03):
period, where there was this kind of hoax, as he calls it,
which was propagated by the media. This was particularly in Britain,
well, in England, I guess. And
the claim was that the three-jab vaccine, as it was called, was causing
(04:24):
autism in kids.
And they were
saying there were these links from this. So in particular, there was a study by Andrew Wakefield,
which
analyzed 12 kids and showed, oh, okay, there's this kind of gut bacteria thing or something like this, and this is the link, etcetera, etcetera.
He had a rather big conflict of interest, and this was that, I think, he actually owned a clinic treating autism. So it was very much in his favor to
(04:53):
kind of be promoting this.
And there was some real ethical dubiousness on whether the 12 kids in the study
actually needed lumbar punctures and colonoscopies that they ended up receiving,
whether that was
medically necessitated by their doctors or whether he was kind of doing this to prove a point and, you know, further an agenda of his.
(05:15):
We have things like unreplicable results.
So this was the MRSA
superbug once again in England. Ben Goldacre is an English
author, so he's talking about things in his own country.
And this was essentially saying, oh, there's all of these superbugs in hospitals.
And so there were these investigative reporters going to places, going to the walls and, like,
(05:38):
in hospitals, taking some samples.
And
there were all these findings showing, oh, wow, like, there's superbugs in all of these hospitals, it's terrible.
They're not keeping the hospitals clean. They're dangerous places, etcetera, etcetera.
But in reality, they were just taking all of these results to one dude called,
what was his name? Christopher Malyszewicz?
(05:59):
I'm pretty sure that's Polish. Way too many zeds and w's
in that name.
And it was just some random dude in his
backyard
shed laboratory
who was finding all of these results, and they were unreplicable. If you took them to more
standard, professional
laboratories,
they weren't getting the same things from the same samples,
(06:22):
hence indicating that perhaps his sampling equipment was contaminated and, you know, he wasn't very thorough
in his process, etcetera, etcetera.
The media jumped onto this, though, of course, with
tasty headlines:
MRSA superbugs. And so,
unreplicable results? No, it doesn't really matter if we're getting good headlines and sales.
(06:46):
This goes on and on. Cherry picking of data.
They talked about Linus Pauling and the link between vitamin C and the cold, where he was picking
studies and trials that supported his thesis from the get-go.
Small sample sizes.
Did you know cocaine use in British children doubled over a year period?
(07:06):
But, yeah, look, 9,000 kids is, I believe, actually a decent sample size. But
the questionnaire, how the trial was conducted,
it wasn't, you know, like, randomized. They weren't doing these double-blind studies, things like
this. It's so easy to pick out results, and even things as simple as reporting that it went from one to two percent when in fact it actually went from about one point three to one point nine percent.
(07:33):
You know, the effect is roughly halved from what they're saying. So you can't say it doubled. It was actually more like
forty-six percent up.
And this is a very common thing you'll see in in media
highlighting statistics and blowing things out of proportion
when it's like, yeah, sure, something may have doubled
in likelihood,
(07:54):
but the doubling could be from like one in fifty five million to one in,
what would that be? Twenty seven point five million. So
yes, it's doubled, but it's still very, very unlikely.
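Kyrin's point about relative versus absolute changes can be sketched in a few lines. This is an illustrative sketch, not the book's own calculation; the figures are the ones mentioned in the review (a rise from 1.3% to 1.9% of respondents, reported in the press as "1% to 2%, i.e. doubled").

```python
# The same survey result framed two ways, as discussed above.

before, after = 1.3, 1.9   # percent of respondents, before and after

absolute_rise = after - before                    # in percentage points
relative_rise = (after - before) / before * 100   # true percent change

print(f"absolute rise: {absolute_rise:.1f} percentage points")
print(f"relative rise: {relative_rise:.0f}%")     # about 46%, not 100%

# Round each figure to a whole percent first and the story changes:
rounded = (round(after) - round(before)) / round(before) * 100
print(f"after rounding first: {rounded:.0f}%")    # 1% -> 2% reads as "doubled"
```

The rounding step is the whole trick: round first and you manufacture a doubling out of a rise of less than half.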
And this just goes on and on. I'm not going to be able to touch upon all of these things. Speculative language,
recent research, for example, showing that turmeric is highly protective against many forms of cancer, especially of the prostate,
(08:20):
which was just taken from this random,
you know, turmeric reacting with some sort of bacteria in a petri dish,
extrapolating that out and saying, okay, it's working in this way. And so,
yeah, saying it's highly protective. When in fact, to ingest the amount needed,
by just doing some simple math, you'd need to eat 100 grams of turmeric, which is like trying to eat 100 grams of cinnamon off the spoon, if you've seen those challenges of people putting that in their mouth and then just coughing it up. So,
(08:49):
all sorts of things. Authority writing:
Gillian McKeith, PhD,
which was actually from a non-accredited American correspondence
course, and she was selling pills and enzyme powders as a doctor on kind of the standard TV you'd expect back in those days. This is the early two thousands, remember, which is really what he's talking about in this book. Maybe late nineties as well.
(09:13):
And there was just many more things.
Correlation not equaling causation,
calculating statistics wrong.
Probably the worst
case he talks about in this book was Sally Clark, who
had two children
who died of
SIDS,
sudden infant death syndrome. And I don't know that much about it, but it's a very rare thing to happen.
(09:35):
And they were saying it's so, so rare.
You know, one in three thousand eight hundred and forty five, I believe, was the number. And if you multiply the chances of that happening, it's like one in, you know, fifty seven million or one in, like, twenty seven million
of this happening.
And this should only happen once in a hundred years. And this was presented in a court case that was against her.
(10:00):
And the doctor who did this,
one, did the math wrong, and it actually isn't even that unlikely, because there are genetic factors and things like this. So if you have one child die from SIDS, it's probably more likely that you're gonna have another one.
Barring that, the more egregious thing was,
they presented this statistic in court, and therefore, the jury interpreted it
(10:24):
as, okay, there's a one in twenty seven million chance, for example, that, she is innocent.
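The two statistical mistakes in the case, treating the two deaths as independent events and then reading the resulting number as the probability of innocence (the prosecutor's fallacy), can be sketched roughly. A hedged sketch: every probability below is an illustrative round number, not a figure from the actual trial.

```python
# Illustrative sketch of the Sally Clark statistical errors.
# All probabilities are made-up round numbers, NOT the trial's figures.

p_sids = 1 / 8500            # assumed chance of one SIDS death in a family

# Error 1: squaring assumes the two deaths are independent events.
p_two_sids = p_sids ** 2     # roughly 1 in 72 million
# Shared genetic and environmental factors mean a second death is far
# more likely than this, so the squared figure understates the chance.

# Error 2 (the prosecutor's fallacy): even that squared figure is
# P(two SIDS deaths | innocent), not P(innocent | two deaths).
# The fair comparison is against the competing rare explanation:
p_double_murder = 1 / 100_000_000   # assumed, also vanishingly rare

# Given that two deaths occurred, compare the two rare explanations:
odds = p_two_sids / p_double_murder
print(f"odds of double SIDS over double murder: about {odds:.1f} to 1")
```

On these made-up numbers, the "one in tens of millions" event is still the more likely of the two rare explanations, which is the book's point: the jury heard the wrong conditional.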
Whereas, in fact, you could just do the same math and go, okay, what's the likelihood of
a parent committing a double murder against their children? And it's like one in fifty-eight million, one in one hundred million, even more unlikely than
(10:46):
two children
dying. So
the way that people use statistics, especially
in a one-off case and a non-nuanced discussion or argument,
is egregious at the best of times. You've always got to be careful with statistics when it's just one single thing like that. And unfortunately, she went to jail for many years. Essentially, her life was ruined. She ended up
(11:10):
dying from,
I think it was alcohol poisoning,
you know, a very traumatic case
where her two children died and then she got blamed for it at the same time. So all sorts of things like this where you go, okay,
the
the science is hard. Science is very hard, and bad science, which he's talking about in here, is when things get misinterpreted,
(11:32):
when the trials themselves aren't conducted in an ethical manner, or in a manner which is going to
show reliable
results, or when people come into things trying to prove something already, or they've got an economic incentive.
There's all these sorts of things happening. So
what about the news section or the media
portion of this? Because
(11:53):
that's probably always gonna happen. Science is very hard, and there needs to be some leeway and a lot of nuance in this. And,
how should the media or how does the media report on these sorts of things?
And
look, I have a little personal section here. I've long derided
the mainstream news
(12:14):
the medical aspect for sure, but mainstream news in general,
for not being a truth telling machine.
My main reasoning is that the advertising model, which they largely rely upon,
especially nowadays, compared to perhaps the past where subscriptions could
sustain a newspaper. This is largely not the case anymore. Advertising is dominant and has been for a long time, and it just creates a perverse incentive where it's not about truth telling, it's about
(12:44):
making money, it's about getting headlines. And this is not a new thing. Attention economy. We've all heard these sorts of things. So
his section in the book on nutrition
highlights, I guess, some of the
intellectual grievances
which
are just as bad in terms of
(13:04):
why
things get distorted and why truth telling is not
what is occurring, but in fact something else is. The newspapers or the media, the TV,
documentaries, all these sorts of things, they're not showcasing
truth. They're showcasing something else. And what is this something else? Well,
(13:28):
another quick backstory here. I've got a bachelor's degree in engineering.
Did that a long time ago.
I did relatively well at that. And this is actually linked to one of the chapters in the book, which is called,
why clever people believe stupid things. And
to a certain extent, you could call me clever because I got good results, but
(13:49):
I am definitely stupid in many, many, many things. And,
I remember when I was doing probably the most scientific thing I've done in my life, which was writing my thesis,
to graduate as part of my degree.
And
the amount
of cherry picking of data, of
(14:10):
trying to find something
to be able to write about in my thesis. I was doing it on, like, mine seals for underground mines and how best to do that properly,
considering factors like water, explosives, things like this, or
explosions that can happen because of pent-up gas underground, and structural earthquakes, all these sorts of things. And
(14:34):
I remember just doing this and I was like, this is not a great thing. What I'm producing here is not going to be
stuff that is, I think, going to be useful for the world, because
I didn't have
academic
training per se, and really the ethics of it. What was being highlighted to me by my professors and teachers was not, like,
(14:58):
academic rigor. It was: you've got to produce something. And, you know, I was in the final year of university. I wanted to get this thing fucking done because I was sick of being at university.
And so I look at this and I go, you know,
what were the reasons for this? And
sure, I produced a piece of,
academic literature,
(15:19):
but it is not worthy of citation. It is
worthy of being ridiculed for cherry picking of data,
of looking to prove a hypothesis.
And it's fine coming in with a hypothesis, but
really trying to prove it and then
altering your methods and cherry picking and, you know,
using statistics to find things, is no bueno. It's not good.
(15:43):
And thankfully, I was doing it in an industry where it actually didn't matter. And I'm pretty sure that thing has never been cited. I'm not even sure how you'd find it.
With all that being said,
what you see in this book is way, way, way worse than anything I ever did
by
a huge margin.
And so
(16:04):
what do we see with how the media presents
certain things and
individuals as well? So we can kind of understand individuals
doing things for
personal gain. Perhaps they want the fame, perhaps they are looking for money,
Perhaps they're trying to, you know, justify their position, their role, you know, all these sorts of things.
(16:29):
That's okay.
Well, it's not okay, but it's understandable. You can see why this is gonna happen and how it's probably impossible to
stop anyone from doing things like that. And I'm very much more of the free will type. People are gonna do things, you know, and mistakes will happen on a small individual level. But when you get into media organizations where
(16:52):
there should be levels to this stuff of, okay, this is perhaps more for an expert in the field of nutrition to be writing on these things.
There probably needs to be, like, an editor going over this, or a fact checker. And,
you can
misrepresent things. You probably need lawyers looking over what you're saying and saying like, is this
(17:14):
justifiable
in terms of extrapolating from this data? Is this a reasonable thing? Just basic common sense,
which is terribly lacking. And so what are some of the things we see from this? We see things like
antioxidants are good, so drink red wine and eat dark chocolate.
There was a story about trying to prove who had the
(17:35):
sexiest strut, and they ended up finding that Jessica Alba did.
And
this was one sponsored
by a hair removal cream company.
They asked a doctor to come up with a formula or proof
to rank
this survey data that they've done,
(17:56):
And they wanted them to be showing that people like
Jessica Alba
and,
who's the Latina singer,
Jennifer Lopez, Jenny from the block,
had sexier struts than someone like Kate Moss or
Amy Winehouse.
And so they got this doctor to come up with this, like, ridiculous formula
(18:21):
related to probably like stride length,
hip ratio,
probably something about legs so that the hair removal cream company could be like, hey, look at these sexy legs. You know, this is important.
That's the more humorous, comical side, while also just highlighting the stupid things the media
can do in promoting this stuff.
(18:42):
The more gross ones are the MMR, MRSA
scares, which kind of showed a coordinated effort
to not be seeking the
truth, publishing what will sell
and
kind of going out of their way to misinterpret data, or, when there is pushback
saying, like, no, you're wrong: we're the media company, we're promoting this. And I very much do see it as promotion. And this just occurs
(19:10):
continually throughout the book. So,
I found it funny whilst reading
all these different sections,
I found his outrage. And the book is written in a style where he's making fun of a lot of
these instances,
bringing them to light, showing, like, this particular person wrote about this particular thing and they're wrong, and here's why.
(19:34):
And
I found it more palatable or acceptable when he was talking about the media organizations as a whole,
fucking things up and
misrepresenting things.
And I'm not really sure why I find that more
acceptable compared to, like, your average nutritionist
on TV or homeopath
(19:57):
coming up with
equally ridiculous things,
related to, you know, if you eat this one thing, it's going to improve your chances of X by, you know, 500%
or whatever.
And
I think the reason is that they're very similar in many ways.
Both contain misguided fools,
(20:19):
whether this be the homeopath or the journalist just writing random shit.
Both have a profit incentive linked to their misunderstanding,
whether they're selling newspapers or selling pills
or selling
their latest x y z book.
I'm not sure. I think it's probably because I feel there's a bit more malice
(20:43):
or intentionality
in the media,
journalists
and in particular
when they are
misreporting
scientific
data,
probably because they're part of, like, a corporation. It feels like there should be more checks and balances.
Whereas I feel the homeopaths
(21:04):
and people who are
along those lines are more
simpletons in a way where
they just think they're doing good. They just don't
understand. Like, they're just so ignorant of what they're doing.
You know, maybe I'm wrong in that. It's probably not the case, and I should perhaps reverse that
(21:25):
perception, as both can cause ill effects for their consumers. Perhaps that's the thing. You know, the
MMR and MRSA
hoaxes, misinterpretations,
caused people to do things which
had, I would say, a more negative outcome.
They
would not get vaccines for their children, and hence the amount of measles in the UK
(21:52):
went from essentially nothing to, okay, now we've got, like, a serious problem here.
But you can also have the same thing if someone's saying, like, oh, you just need to, you know,
eat dark chocolate and drink red wine, and those antioxidants will
cause you to live longer or something like that. Maybe they won't go to the doctor because of that.
(22:14):
Or if they're getting a homeopathic,
alternative
medicine treatment instead of, you know, drugs or something. Yeah, it's a tricky topic. Tricky topic. So
I guess, like, summarizing this sort of section, what are his suggestions for helping
fix these problems related to bad science and media?
(22:36):
From what I can tell, it was more of a call to individual action,
as
structural changes probably won't take place.
And the individual action can be on the part of the person actually doing the science, and on the part of someone more like me, who's a consumer of information.
So
(22:57):
statistics are always going to get misinterpreted by the media
and promoted for that. The incentive model for a lot of media, I still think the advertising model
is inherently
going to stuff you up, whether it be for a small individual podcast like mine, or whether it be for something much larger, like an organization,
(23:22):
or,
like the New York Times, for example, or something even bigger, like a social media network, like Facebook or Twitter,
there's going to be problems from that sort of method. So
he is really saying like, look, you as a layperson
can understand these things. What we talked about, the misinterpreted
results, you can
(23:43):
do basic things to look at trials and studies and say, okay, was this actually done in a
semi correct manner?
The problem with that is it's still rather dense. You know, reading
trials and
academic papers is very dry. You know, I personally don't really try and do that. And then you can get into the whole
(24:04):
statistical side of things, where it's like, is this r score correct? You know,
were they cherry picking from these results? You know, was their multiplication
done in a serious way?
So
that's that's tricky.
He does have a kind of call to action for scientists, academics that if they see things in the media,
(24:31):
that are wrong, reaching out and saying, hey, you've got this wrong,
causing a bit more outrage about it.
And then also being careful whilst talking to the media, because they are
looking for headline, attention-grabbing things.
And so
saying numbers more like
the amount of children taking cocaine in the UK rose
(24:55):
from twenty two to forty seven over this year, rather than quoting a percentage increase year over year, is a smarter thing to be able to say.
And then there's just danger zones of nutrition, health
when that comes into the media.
It's a dangerous area in which to take things just at face value. So
(25:18):
let's jump into the author and some extra details.
Ben Goldacre was born in 1974.
He's a doctor himself, a practitioner.
He calls himself a nerd evangelist.
So really promoting the cause of open science
practices and clinical trials,
publishing of data.
I neglected to mention this, but there were plenty of times where people were saying, like, we found this data, but then not actually publishing it in a medical journal or anything like that, anywhere.
(25:45):
This was particularly related to like AIDS,
the AIDS crisis, anything that gets people worried and things like this. You're going to get
spruikers,
quacks, all these sorts of things.
His style is a real mix,
and he's written a couple of other books related to this: Bad Pharma,
stuff on statins,
and just assorted journalism over the years.
(26:08):
He's got spunk. Let's put it that way. So
as you read in the book, he's
highly, highly
critical of certain people. It's kind of, you know, the British sarcastic slash dry humor, a little bit deadpan.
And
He, for example, says he doesn't like to pick on people, but he certainly does a lot of that in this book,
(26:33):
calling out bad actors mostly for their methods. And,
I think this is one thing I'll give him some credit for. Yes, he made fun of ridiculous statements and claims,
but when it came to actually saying, hey, this is how the science works,
this is what they got wrong, showing misinterpreted
statistics and things like this, he was a lot more serious. So
(26:55):
I think that that was probably a decent thing to do.
And
the book is funny in a way, but perhaps off putting and unhelpful,
if he presses one of your buttons. So,
it's certainly easy to have something just ingrained in you from birth, from childhood.
You know, one thing for myself, for example, that I learned as a kid was, you gotta get tons of calcium,
(27:21):
for healthy bones. I have no idea if that's true. I have no idea if milk actually matters to drink when you're a kid, or if it's even important for me now.
That's one of those things that certainly I was influenced by the media when I was a kid, and
that is still somewhat stuck in me in a way. And I have no justification
(27:41):
or reasoning for saying
if that is true or not. It's just something I've heard a long time ago. So
probably a critique that I have maybe on his style is I didn't like how he seemed to take joy in belittling the homeopaths in particular.
Once again, perhaps that's my bias, just thinking that they're more on the side of fools rather than
(28:05):
particularly
malevolent or actually causing bad outcomes.
He mentions he's had the experience of anger, accusations,
lawyers coming after him, things like this.
And so I can understand why he sees more of the darker and negative side of people like homeopaths and nutritionists.
(28:26):
But when you're reading from the outside
and interpreting these people as misguided
slash quacks,
with the old scammer here and there, it does seem to come across as rather mean and perhaps unnecessarily so in some ways.
But that's the style, and it's going to attract pushback,
his ridiculing.
And I'll probably put this out here and address a concern that one might have, because you could read this and go, okay, but he's just promoting
(28:54):
his book, he's promoting his
newspaper
column. And perhaps he has a practice as well that, you know, he thinks if he draws attention to himself, people will come to it, or something like that.
And
I didn't get that from him. And
there's a couple of reasons for this.
(29:16):
He also notes the opposite side of things. So he was more than willing to get into
big pharma. He's got a book called Bad Pharma about
the
large companies
doing bad practices
and things like this as well.
Cherry picking data, statistics, all this sort of stuff. So
(29:36):
I definitely got the feeling he was more just like, I want the truth. I want the truth. And the scientific method is the best way of
gaining this, which is something that I personally believe as well.
For example, something that was maybe useful for his point of view, his side if you want to call it that, was,
(29:57):
there was
after the MMR
hoax,
it then kind of reversed,
as newspapers tend to do. And many years later, it was now showing, like, this was all this one guy's fault: Andrew Wakefield,
he's being, you know,
investigated for unethical practices.
What a terrible guy. And the newspapers, these are the very same newspapers which were promoting him, you know, five, ten years earlier,
(30:23):
and then claiming, like, the MMR autism link is definitely refuted.
And
Ben was saying, look, that's also an overreaction. That's not helpful as well. There needs to be nuance. There needs to be
an understanding that these things aren't so simple. And he's
got a book, I think, which is called,
(30:44):
I Think You'll Find It's a Bit More Complicated Than That, which is,
once again, probably a terrible book title to sell books, but probably the most
scientifically accurate and smartest way to present things. So,
although his style is provocative in many ways,
that was when it was directed at the foolishness, and with the science and methods
(31:08):
he was much more serious.
On an unrelated personal note, reading this as well, before I get onto the summary:
how the placebo works is mysterious,
still rather mysterious. So there is a
section called the placebo effect, which goes for 23 pages.
It's really interesting.
I'd always had a healthy respect for placebos,
(31:32):
and there was just a couple of things I learned in here,
telling how,
when doctors know they are giving a placebo, i.e. it's not double blind,
it has diminished effects.
So when the doctor
in their trial is just saying, like, here's this pill, it's going to help you: if they know the pill is just a placebo,
(31:57):
it has a diminished effect. So it's something about the doctor's
way of presenting it. Perhaps they're not
smiling,
perhaps
they're
giving off other cues. The very
tiny
minutiae
of presenting something can have a noticeable statistical effect on patients and how they feel,
(32:21):
and not only how they feel, but
how
their results in their blood and blood work and things show an improvement.
So
new medicines,
their first hurdle is to beat the state of the art, but many struggle to simply beat the placebo effect as well. The amount of times just a sugar pill can work just as well as
(32:45):
millions, if not billions of dollars of research is staggering.
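To give a feel for how hard "beating the placebo" is, here is a rough two-arm trial sketch. None of these numbers come from the book; they are made-up assumptions, and the test is a standard two-proportion z-test using only the Python standard library.

```python
# Made-up two-arm trial: drug vs placebo, two-proportion z-test.
from statistics import NormalDist

n = 200                    # patients per arm (assumed)
improved_placebo = 70      # 35% improve on the sugar pill alone (assumed)
improved_drug = 84         # 42% improve on the drug (assumed)

p1, p2 = improved_placebo / n, improved_drug / n
p_pool = (improved_placebo + improved_drug) / (2 * n)   # pooled rate
se = (p_pool * (1 - p_pool) * (2 / n)) ** 0.5           # standard error
z = (p2 - p1) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))            # two-sided

print(f"placebo arm: {p1:.0%} improved, drug arm: {p2:.0%} improved")
print(f"z = {z:.2f}, two-sided p = {p_value:.3f}")
```

On these assumed numbers, a seven-percentage-point gap over a 35% placebo response gives a p-value of roughly 0.15, so the trial fails to show the drug beats the sugar pill at the usual 0.05 threshold, despite looking better on paper.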
And,
even the
questioning of whether there is a role for an ethical placebo,
which is hyping up something that has no
benefit other than
the placebo, kind of like bootstrapping it even to a higher level,
(33:06):
is an interesting concept, and has given me more appreciation of the
power that our own mind and kind of positive thinking can have, not only on our
actions and internal feelings, but also
on physical results in the body. It's crazy.
So jumping on to the summary and similar book recommendations.
(33:28):
Look, I feel like I've done a bit of an average job explaining
what's within here because there were so many studies, there were so many instances of
newspapers doing things wrong, nutritionists doing things wrong, homeopaths doing things wrong, scientists doing things wrong,
journalists doing things wrong, and
all of the reasons that I've explained why they do it wrong. So,
(33:49):
I found some tidbits of this useful, but personally, I had kind of already come to many of the same conclusions that the book is presenting. So,
once again, this could just be me seeking out things which already prove my own bias, if you wanna put it that way.
But I found it a useful reminder,
even if I kind of alternated
(34:11):
whilst reading between laughing at some of the things he was saying, and also just eye rolling or going, like, that's a bit mean and unnecessarily
so, in my opinion. So
I think it's ultimately a beneficial book,
even if it is going to alienate some people,
because it is fundamentally about the science and not the sarcasm or the witty jibes.
(34:34):
So
for example, if everyone had read this book before COVID,
it probably would have saved,
I'd say, like, a significant
portion of people a lot of mental stress, because they'd have been able to see through
a lot of
the fear mongering, the hype, the scares, and perhaps even
(34:55):
have gotten them to go, oh, okay, some of this data is being presented in a way where it is misinterpreted,
or it is
miscalculated
or there is an incentive of why they would be showing it this way. So
tricky, tricky topics that require nuance are probably what this book is all about. So I am giving Bad Science by Ben Goldacre a very solid seven out of 10. Yeah, worth checking out at some point if you're into it. It's a popular science book, so probably a similar recommendation would be something like Richard Dawkins,
(35:29):
who I think is kind of similar. He's willing to call out and
make fun of
what he regards as silly
positions.
And
whether that is to your taste or not
is up to you. Like I said, I kind of alternated with this book.
All the other things that I could recommend, other books of medical type stuff I've reviewed on this channel before, like,
(35:56):
what was it, Siddhartha Mukherjee's one on cancer,
The Emperor of All Maladies, and other just medical ones. They're less spicy. So this was kind of a spicy book. So
be warned, be warned. Let's jump into the last sections here. I mentioned previously no advertising, and I take that seriously. So
(36:17):
this is a value for value podcast. Put all of this out here
for free, and I just ask that you provide some value in return. Many different ways of doing this. The simple stuff: liking, subscribing, commenting,
telling a friend about the Mere Mortals book reviews or any of our other podcasts, checking out any of the other ones that we've got.
Links down below to our website and things like this.
(36:39):
You can also give me a book recommendation. What do you think I would enjoy? What would you like to see me review on this channel? And then finally,
the treasured monetary aspect. If you want to help support the channel in that way, there's a PayPal link down below, or you could use a modern podcasting app: meremortalspodcast.com/support.
Last section here: what is coming up, and a reminder that I am live. I do these live on Wednesdays at 11AM
(37:07):
Australian Eastern Standard
Time. Ask your AI what time that is in your time zone
and really enjoy when people jump in.
What is coming up? So I'm reading another Lee Kuan Yew book at the moment
and that's a thick one. So I'm probably going to have to find a small book to be able to read before next Wednesday, to be honest. So we'll see, because a couple of these books recently have been rather thick and I've had some other stuff going on, so I haven't been reading as much as I normally do. So,
(37:38):
yeah, that one's called From Third World to First by Lee Kuan Yew.
That'll either be next week or the week afterwards. So that's all I know. And I'm probably still reading The Lord of the Rings, so more of that will come out
soon. Long one here, but enjoyable. So thank you very much for tuning in, I hope you got some value from this, and I really do hope you're having a fantastic day wherever you are in the world. Ciao for now.