Medical historians have shown that improved nutrition, sanitation, and other behavioral and environmental factors, not anything learned from animal experiments, are responsible for the decline in deaths from the most common infectious diseases since 1900, and that medicine has had little to do with increased life expectancy. Many of the most important advances in health are attributable to human studies, including the development of anesthesia, bacteriology, germ theory, the stethoscope, morphine, radium, penicillin, artificial respiration, antiseptics, x-rays, and the CAT, MRI, and PET scans; the discovery of the links between cholesterol and heart disease and between smoking and cancer; and the isolation of the virus that causes AIDS. Animal testing played no role in these and many other developments.
The role of animal studies in the development of many treatments has been exaggerated and twisted to fit the goals of those who promote animal experimentation. The polio vaccine is a case in point: its development involved two separate bodies of work. The first, the in vitro (non-animal) studies, was awarded the Nobel Prize; the second, the subsequent animal experiments in which close to 1 million animals were killed, the Nobel committee refused to recognize as anything more than wasteful. Worse, early polio studies on animals misled researchers about the route of infection and delayed the development of a vaccine for decades.
It is impossible to say where we would be today had we refused to experiment on animals, because throughout medical history very few resources have been devoted to non-animal research methods. We do know that animal experiments frequently give misleading results, and many believe we would probably be better off had we never relied on them or ignored avenues of research more relevant to humans, such as epidemiological and cell-based studies.