Studies of scientific papers reporting animal experiments have revealed many flaws in their design and reporting, which are generating considerable concern, not least among research funders (1). These include, but are not limited to:
- Poor experimental design and risk of bias, even in high-impact journals (2, 3), in particular lack of statistical power (4) and lack of blinding (5).
- Artefacts caused by extraneous environmental factors, such as effects of animal age (6), cage conditions (7, 8), concomitant subclinical infections (9), food/water restriction (10, 11) or the sex of the experimenter (12) or animal (13).
- Poor compliance (14, 15) with guidelines for reporting animal experiments, including lack of details about anaesthesia and analgesia (16, 17).
- Poor reproducibility of animal studies (18, 19) when a model is moved, for example, from an academic environment to the pharmaceutical industry. This was the subject of a seminar organised by funders in the UK in 2015 (20).
- Lack of translatability from animals to humans (21, 22).
- p-value hacking (also called data dredging, data fishing, data snooping, data butchery).
- HARKing (Hypothesising After the Results are Known).
- Publication bias.
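The mechanism behind several of these problems, particularly p-value hacking, can be illustrated with a short Monte Carlo simulation (a hypothetical sketch, not from any of the papers cited above). If a study with no true effect measures many outcomes and reports whichever one reaches p < 0.05, the chance of at least one false positive climbs from the nominal 5% towards 1 − 0.95^k for k independent outcomes:

```python
# Sketch: why testing many unplanned outcomes inflates false positives.
# All names here (z_test_p, false_positive_rate) are illustrative, not
# from the source. Uses a z-test with known variance to keep the code
# dependency-free.
import math
import random

def z_test_p(sample, mu0=0.0, sigma=1.0):
    """Two-sided z-test p-value for the mean of `sample` (known sigma)."""
    n = len(sample)
    z = (sum(sample) / n - mu0) / (sigma / math.sqrt(n))
    # erfc(|z|/sqrt(2)) equals the two-sided normal tail probability
    return math.erfc(abs(z) / math.sqrt(2))

def false_positive_rate(n_outcomes, n_experiments=20000, n_animals=10, seed=1):
    """Fraction of simulated null experiments (no true effect) that yield
    at least one p < 0.05 when `n_outcomes` outcome measures are tested."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_experiments):
        p_values = []
        for _ in range(n_outcomes):
            # Data drawn from the null: mean 0, so any "effect" is noise
            sample = [rng.gauss(0.0, 1.0) for _ in range(n_animals)]
            p_values.append(z_test_p(sample))
        if min(p_values) < 0.05:
            hits += 1
    return hits / n_experiments

print(false_positive_rate(1))   # close to the nominal 0.05
print(false_positive_rate(10))  # roughly 1 - 0.95**10, i.e. about 0.40
```

Pre-registering a single primary outcome, or correcting for multiple comparisons, removes exactly this inflation, which is one motivation for the reporting guidelines discussed below.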
References
- Gandevia (2016): We Need To Talk About The Bad Science Being Funded
- Macleod et al. (2015): Risk of Bias in Reports of In Vivo Research: A Focus for Improvement
- Bara & Joffe (2014): The methodological quality of animal research in critical care: the public face of science
- Wasserstein (2016): The ASA Statement on p-Values: Context, Process, and Purpose
- Bello et al. (2014): Lack of blinding of outcome assessors in animal model experiments implies risk of observer bias
- Jackson et al. (2016): Does age matter? The impact of rodent age on study outcomes
- Toth (2015): The influence of the cage environment on rodent physiology and behavior: Implications for reproducibility of pre-clinical rodent research
- Swetlitz (2016): Lab mice have a chill, and that may be messing up study results
- Szilagyi (2016): Parasitic infection may have spoiled zebrafish experiments
- Norecopa (2009): Guide to fasting rodents
- Bradbury & Clutton (2016): Review of Practices Reported for Preoperative Food and Water Restriction of Laboratory Pigs (Sus scrofa)
- Sorge et al. (2014): The scent of a man: Gender of experimenter has big impact on rats' stress levels, explains lack of replication of some findings
- Spain (2014): Females Ignored in Basic Medical Research
- Omary et al. (2016): Not All Mice Are the Same: Standardization of Animal Research Data Presentation
- Carbone & Austin (2016): Pain and Laboratory Animals: Publication Practices for Better Data Reproducibility and Better Animal Welfare
- Coulter et al. (2009): Reported analgesic administration to rabbits, pigs, sheep, dogs and non-human primates undergoing experimental surgical procedures
- Bradbury et al. (2016): Pain management in pigs undergoing experimental surgery; a literature review (2012–4)
- Kelly (2015): More scrutiny is needed of irreproducible science
- Kaiser (2016): If you fail to reproduce another scientist's results, this journal wants to know
- The Academy of Medical Sciences (2015): Reproducibility and reliability of biomedical research
- Graham & Prescott (2015): The multifactorial role of the 3Rs in shifting the harm-benefit analysis in animal models of disease
- Groenink et al. (2015): European Journal of Pharmacology, Special issue on translational value of animal models: Introduction
See also the section in the PREPARE guidelines on statistical power and significance levels
Anthony Rowe (2022) has written an excellent set of recommendations for improving the use and reporting of statistics in animal experiments.
The REWARD Alliance website was created to promote a series of papers on this topic in 2014 in The Lancet, to help increase the value of research and reduce waste.
It has been estimated that 85% of research is wasted, usually because it asks the wrong questions, is badly designed, not published, or poorly reported. Beyond diminishing the value of the research itself, this represents a significant financial loss: an estimated US$ 240 billion was wasted in life sciences research in 2010 alone. However, many causes of this waste are simple problems that could easily be fixed.
Other references
- More resources in the section of the PREPARE guidelines on experimental design
- The p value wars (again) (Dirnagl, 2019)
- Rein in the four horsemen of irreproducibility (Bishop, 2019)
- Extrapolating from animals to humans (Ioannidis, 2012)
- Reducing waste from incomplete or unusable reports of biomedical research (Glasziou et al., 2014)
- A Survey on Data Reproducibility in Cancer Research Provides Insights into Our Limited Ability to Translate Findings from the Laboratory to the Clinic (Mobley et al., 2013)
- Four erroneous beliefs thwarting more trustworthy research (Yarborough et al., 2019)
- Reproducibility: seek out stronger science (Baker, 2016)
- Big names in statistics want to shake up much-maligned P value (Chawla, 2017)
- Introducing Therioepistemology: the study of how knowledge is gained from animal research (Joe P. Garner et al.)
- The importance of being second (an Editorial in PLOS Biology (2018) acknowledging the value of complementary studies which replicate others)
- Threats to validity in the design and conduct of preclinical efficacy studies: a systematic review of guidelines for in vivo animal experiments (Valerie C. Henderson et al., 2013)
- Ten common statistical mistakes to watch out for when writing or reviewing a manuscript (Makin & Orban de Xivry, 2019)
- Reproducibility vs. Replicability: A Brief History of a Confused Terminology (Plesser, 2018)
- Still not significant - a blog by Matthew Hankins (2013)
- How we edit science part 1: the scientific method (Tim Dean, 2017, The Conversation)
- How we edit science part 2: significance testing, p-hacking and peer review (Tim Dean, 2017, The Conversation)
- How we edit science part 3: impact, curiosity and red flags (Tim Dean, 2017, The Conversation)
- How we edit science part 4: how to talk about risk, and words and images not to use (Tim Dean, 2017, The Conversation)
- How we edit science part 5: so what is science? (Tim Dean, 2017, The Conversation)