The development of a cure for cancer would be as welcome to the human race as the attainment of world peace. In fact, the United States declared a “War on Cancer” with the National Cancer Act of 1971, and since then there has been progress on many fronts, though V-C Day has yet to be celebrated. Cancer is an insidious collection of diseases, and research proceeds in many directions. From 1950 to 1972, a project to develop total body irradiation (TBI) yielded a spinoff: the U.S. Department of Defense believed that cancer patients who had received such radiation treatment — customarily administered along with chemotherapy during bone marrow transplant — resembled troops on a battlefield exposed to radiation from the detonation of an atomic bomb. Military officers especially wanted to know how quickly a soldier suffering the effects of radiation could recover and return to action. The Pentagon even helped fund the studies.
The story of the linkage between a cure for cancer and preparedness for war has now been detailed in “Contested Medicine: Cancer Research and the Military”, by Binghamton University history professor Gerald Kutcher.
The TBI treatments and experiments on human subjects were conducted at the University of Cincinnati College of Medicine by a team headed by radiologist Dr. Eugene L. Saenger. The subjects were unaware that they were serving as surrogates for soldiers in an atomic attack, and there was concern that the radiation they received may have been excessive or even unnecessary, possibly hastening the deaths of several patients. Professor Kutcher scrupulously follows the Saenger experiments, the reaction within his institution and the later Congressional investigations.
The Cincinnati experiments influenced the creation by the Federal government in 1994 of the Advisory Committee on Human Radiation Experiments (ACHRE). But even after exhaustive study of Saenger’s work (record-keeping was inconsistent, many documents were lost or had faded, and physicians’ scrawled handwriting complicated the review), ACHRE was unable to come up with a fixed set of guidelines for bioethics, citing “the dynamic character of medical research” as well as “community standards”.
Since 1947 and the revelation of sadistic Nazi “medical experiments”, world standards in medical research have been guided by the Nuremberg principles — named for the German city that hosted the war crimes trials following World War II. The primary standard is voluntary patient consent. This was later supplemented in the United States by rules formulated in 1966 by the National Institutes of Health and the Food and Drug Administration. Kutcher sees the NIH-FDA involvement as serving primarily to minimize risk (and avoid lawsuits) rather than to enhance patient autonomy.
Gerald Kutcher is singularly qualified to have written “Contested Medicine”. He is now associate professor of history at Binghamton University and chair of the Department of History, and holds a Ph.D. in the history and philosophy of science from Cambridge University. However, before turning to the calling of historian, Kutcher had earned a doctorate in physics and for twenty years was a radiation physicist and Chief of Service in Clinical Physics at Memorial Sloan-Kettering Cancer Center in New York.
Gerald Kutcher joined Bill Jaker on OFF THE PAGE to review the development of medical ethics and the social structures behind ethical standards, and to respond to listeners’ questions about treatment and research.