Richard Harris is a long-time, well-regarded science reporter for National Public Radio, so one has to wonder how he (or the publisher) came up with the title of his new book on the current state of biomedical science: Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hope, and Wastes Billions. Why is it that so many non-fiction books these days have a short, dramatic title intended to catch your eye on an airport bookrack, followed by a subtitle with an alarming description suited for a checkout-line tabloid? Perhaps I just answered my question. Rigor Mortis is itself a play on words: the medical term refers to the stiffness of a body observed in death; here it indicates that rigor in science is dead. I agree with Harris that there are some fundamental issues in the practice of science that need correction, but it would be unfortunate if Harris’s criticisms are used in support of a retreat from science.
Harris estimates that about half of the U.S. government’s $30 billion annual investment in biomedical research yields results that are seriously flawed and can’t be trusted. This is a striking conclusion, but Harris is able to build a somewhat believable case. He focuses mainly on laboratory-based, preclinical, biomedical science, rather than clinical research that involves human subjects. Preclinical research is financed mainly by the U.S. National Institutes of Health (NIH), some foundations, and companies working to develop early concepts for new products that will be used for the diagnosis or treatment of illness. He spends very little time noting the remarkable advances made in biomedical science over the last few decades.
Chapter One describes a remarkable study done by C. Glenn Begley, who headed up cancer research at Amgen, a pioneering biotech company based in California. After a long career at Amgen, Begley wanted to explore why his group often was unable to reproduce the findings of preclinical studies published by various academic groups. Begley worked with the original scientific groups that produced 53 landmark studies and, in a partially blinded manner, tried to reproduce their positive findings that could lead to more effective drugs. They could reproduce the original exciting findings from only six (11%) of the 53 reports. When Begley and his academic colleague Lee M. Ellis, a specialist from the MD Anderson Cancer Center in Houston, published a commentary on this body of work in the prestigious journal Nature in 2012, it created quite a stir in the scientific community. Many scientists pushed back in public on the finding of a “reproducibility crisis,” but Harris reports that others privately conceded the phenomenon was real. Harris goes on to discuss a variety of methodologic steps, some very minor, some fundamental, that, when poorly documented or sloppily performed, can make a finding impossible to reproduce. Such methodologic weaknesses can result in uncertainty about the validity of any study’s findings.
The book proceeds to give similar treatment to other key areas where there are systematic weaknesses in scientific processes: mix-ups in the cell lines used for preclinical research; the inappropriate use of animal models to predict treatments for human diseases; inadequate use of controls; weak study design and interpretation of data without a prespecified hypothesis (data dredging); and the premature publication of data under the pressure to publish (or perish). The quality of Harris’s documentation and analysis is far better than the title of the book might suggest. His prose flows well and is easy to follow.
Harris does a good job describing the perverse incentives that exist for NIH-funded researchers to prioritize winning their next grant, rather than conducting science that will yield the greatest health impact in the world. Pharmaceutical companies are better than academics at making tough decisions. In a competitive for-profit world, you want to fail quickly, so you can move on to what might be the next winner. In academia, whatever gets funded looks great, without full regard for the potential to yield a product that will be important for human health. Harris describes a “broken culture” that encourages scientists to sift through their data and publish those that will be most appealing to high-profile journals rather than presenting the full breadth of their findings.
I find it hard to assess how widespread or damaging this phenomenon is, as it is certainly appropriate to publish the most important findings from research studies. The peer review process, while imperfect, is a solid way to assess the quality of proposals and publications.
At the end of the day, there are some real issues here that bear scrutiny and self-reflection from those who conduct, support, and fund biomedical research. It is noteworthy that some of the solutions to the problems identified in laboratory science are already in use in human clinical research, including the routine use of controls, standardization of laboratory testing procedures, rigorous “good clinical practices,” and analysis plans specified before the data are generated.
Harris discusses the emerging field of metascience, the study of science itself. This promises a way forward to improve processes and practices. What Harris does not adequately discuss are the potential hazards of new “big data” science. We are experiencing an explosive growth in access to huge data sources and the generation of biologic data through human gene sequencing and characterization of the human microbiome. Unless hypotheses are generated with rigorous analysis plans, the scatter-shot analyses of these vast datasets will almost certainly yield a large amount of uninterpretable results.
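The statistical hazard behind data dredging is easy to demonstrate. The short simulation below, a minimal illustrative sketch (not code from the book), runs many comparisons on pure noise: both "treatment" and "control" groups are drawn from the same distribution, so every statistically significant difference is, by construction, a false positive. At the conventional 0.05 threshold, roughly 5% of the comparisons come up "significant" anyway, which is why scatter-shot analysis of a vast dataset without a prespecified plan reliably manufactures spurious discoveries.

```python
import math
import random

def two_sided_p(a, b):
    """Two-sided p-value for a two-sample z-test, assuming known unit variance."""
    n = len(a)
    z = (sum(a) / n - sum(b) / n) / math.sqrt(2.0 / n)
    return math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - Phi(|z|))

random.seed(42)
trials, n, alpha = 1000, 30, 0.05

false_hits = 0
for _ in range(trials):
    # Both groups come from the SAME normal(0, 1) distribution: there is
    # no real effect, so any "significant" result is a false positive.
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    if two_sided_p(a, b) < alpha:
        false_hits += 1

# Roughly alpha * trials (about 50 of 1000) spurious "discoveries" expected.
print(false_hits)
```

A prespecified analysis plan defuses this: if the hypothesis and test are fixed before the data are examined, only one comparison is run, and the false-positive rate stays at the nominal 5% rather than being multiplied across thousands of unplanned looks.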
It would be most unfortunate if the criticisms of science made in Rigor Mortis reinforce the retreat from science that we are already seeing in the U.S. government (see my Stand Up for Science blog post). After all, the ultimate goal of science is to find the truth. While there certainly are aspects of biomedical science in need of review and revision, we should not reject science. The process of how we do science is constantly evolving, and Harris identifies some fundamental issues in need of correction.