Modern science is built on inductive reasoning: all scientific knowledge arises from experience. Scientists collect a large number of observations, or often replicate an experiment. Then, cautiously and with care to consider all interpretations, they draw inferences from the available data and generalize from them, formulating general principles and making predictions. No matter how many data are gathered, inductive science can never achieve absolute certainty, even if a high degree of likelihood can be claimed.1 In certain disciplines, such as the medical sciences, experiments provide the basis for induction.
That medicine progresses as a result of experimentation was a view already advanced by Claude Bernard.2 However, in practice, the social and human character of medicine obliges us to bear in mind certain inalienable aspects of the human condition. Bernard himself said that scientific progress could not justify trespassing against the well-being of any individual. Experiments must be conducted in a way that shows respect for the ethical principles that have always presided over the practice of medicine, those traditionally attributed to Hippocrates: first, do no harm (primum non nocere).3
Certainly, respect for the individual has not always been present. More than once this principle has been violated in experiments performed on groups of persons who were insufficiently informed of what was being done or who could not refuse to participate. In 1714, Charles Maitland inoculated 6 prisoners with smallpox, promising them release,4 and Cotton Mather inoculated 2 of his slaves with the virus.5 Antidotes for hemlock were tested in prisoners (1761), and James Lind administered sea water or vinegar as experimental treatments for scurvy (1747).6–8 In 1812, Joseph F. Hernández inoculated 17 prison inmates in Toulon with gonorrheal pus, managing to demonstrate the distinction between gonorrhea and syphilis and thus shedding light on a matter that had been debated since John Hunter infected himself experimentally with both diseases in 1767.9,10 William Wallace notoriously demonstrated the infectivity of syphilis by inoculating healthy subjects,11,12 and Joseph Alexandre Auzias-Turenne, inventor of the notion of syphilization13 (by which syphilitics were inoculated with syphilitic material with the intention of curing them), forcibly subjected patients at the St. Lazare hospital for prostitutes to the procedure.14 The method was then practiced by such celebrated dermatologists as Gibert, Sperino,15 Hebra, Sigmund,16 and Carl Wilhelm Boeck.17,18 In 1862, Boeck even attempted to treat leprosy by this means.19,20 Decades earlier, in 1803, Thomas Percival had written Medical Ethics, considered the first book on the subject. Percival proposed that a physician wishing to test a new medication should first seek the opinions of his peers.
German Regulations in the First Third of the 20th Century
Biomedical research in Germany between 1900 and 1930 was considered the most advanced of its day, not only in relation to progress made in several disciplines but also because of the ethical standards, regulations, and laws in place to protect the subjects of research. In fact, in 1900 the Kingdom of Prussia established the Berlin Code of Ethics (also known as the Prussian Standards), a series of ethical rules regarding human experiments to test new treatments. The code was probably deemed necessary because of the scandal that followed Albert Neisser's public admission of having inoculated prostitutes with syphilitic serum with the excuse of studying the course of the disease, but in fact furthering contagion by means of his experiment.21 This case was not the first time Neisser showed signs of a more than dubious sense of morality. After receiving certain tissue preparations from Gerhard Armauer Hansen and staining them, Neisser claimed he had found the bacillus responsible for leprosy and attempted to discredit Hansen, the true discoverer.22,23
At the end of the 19th century and in the first third of the 20th, unethical behavior was fairly common among researchers, who were more concerned with success in their scientific endeavors than with the morality of their work. Let us cite a few examples:
- 1880. Hansen, obsessed with growing the leprosy bacillus, inoculated infectious material into a woman's eye. He was unable to grow the germ, but the patient suffered vision problems and reported him to the authorities, leading to Hansen's loss of his hospital post.
- 1897. The bacteriologist Giuseppe Sanarelli claimed to have identified Bacillus icteroides as the pathogen responsible for yellow fever in Brazil and Uruguay, offering as proof his injection of material from a culture into 5 patients without their consent. Three of them died.
- 1900. Walter Reed used 22 Spanish immigrant workers in Cuba to test the hypothesis that yellow fever could be spread by mosquito bites.
- 1906. Richard Strong of Harvard infected prisoners in the Philippines with cholera so that he could study the disease. Thirteen died, and the survivors were rewarded with cigars. During the Nuremberg trials, Nazi doctors cited this study to justify their own medical experiments.
- 1913. In the US state of Pennsylvania, 146 children were inoculated with syphilis in several hospitals.
- 1915. Joseph Goldberger, under the supervision of the US Public Health Service, induced pellagra in 12 prison inmates in order to investigate possible treatments.24
- 1919-1922. At San Quentin Prison in California, the testicles of goats or of recently executed prisoners were implanted into the abdomens or scrota of living inmates.
In 1931, many years after the Berlin Code of Ethics of 1900, the German Ministry of the Interior issued “directives for new therapies and experiments in humans” which incorporated the legal doctrine of informed consent. It was forbidden to experiment on patients who were dying, poor, or socially disadvantaged. It was also stated that proportionality of risk and benefit must be respected and that experiments should first be done in animals.
Medical Experiments in the Third Reich
As detailed in an article in this issue of the journal,62 the 1933 ascent to power of Adolf Hitler's National Socialist German Workers’ Party led to the breakdown of earlier standards of ethical research conduct, completely reversing the fundamental principles of respect for study participants. Acting on the electoral promises that had brought him to power, Hitler set racist policies into place in defense of the “superior race.” Carrying out Hitler's policies required the cooperation of a large number of health care professionals. The first step was the enactment of a law for the prevention of hereditarily diseased offspring (Gesetz zur Verhütung erbkranken Nachwuchses), better known as the sterilization act of 1933. Under this law, a hereditary health court (Erbgesundheitsgericht) consisting of 2 physicians and a judge could order the forced sterilization of individuals diagnosed with congenital mental retardation, schizophrenia, manic-depressive psychosis, hereditary epilepsy, hereditary chorea (Huntington disease), congenital blindness or deafness, marked deformities of a hereditary nature, severe chronic alcoholism, and many other conditions. This law was applied along with one passed for the same purpose and using physicians in the same way but directed at sterilizing dangerous habitual criminals (Gesetz gegen gefährliche Gewohnheitsverbrecher). Sterilization, which began in 1934 and continued until the start of World War II, was performed on nearly 400 000 persons (0.5% of the total population).
The purpose of these and other laws (one to protect the hereditary health of the German people and another to safeguard marital health, known as the Nuremberg Laws) was to eliminate an entire generation of genetically deficient individuals, “purifying” the gene pool and improving the “Aryan race.”25 The benefits that were to derive from applying these eugenics laws were widely and explicitly publicized during campaigns run by the Third Reich's efficient propaganda machine.26
Inside the concentration camps, the large-scale medical experiments performed were of 3 types: 1) research whose purpose was to improve survival for German troops exposed to the weapons of war (gases, incendiary bombs, radiation) or adverse weather conditions (cold, high altitude); 2) the testing of new drugs or surgical techniques; and 3) the proving of National Socialist theories of racial superiority (anti-Semitism, eugenics). Other experiments took place that served no purpose other than to cause suffering or to exterminate groups. The following are a few examples from this period:
- Research on medical treatments for wounds sustained during warfare. Wounds were created and glass and other fragments were introduced into them; wounds were also infected with Streptococcus species and Clostridium perfringens or Clostridium tetani in order to test the efficacy of treatment with sulfonamides (Ravensbrück, 1942-1943). Mustard gas (Sachsenhausen and Natzweiler, 1939-1945) and phosphorus (Buchenwald, 1943-1944) were used to cause injuries and study their clinical course.
- Survival studies. The Nazi medical system quantified the number of days a person could survive drinking only sea water (Dachau, 1944), living at freezing temperatures (Dachau and Auschwitz, 1941), or confined in low-pressure chambers that simulated high-altitude conditions (Dachau, 1942).
- Efficacy of ingested poisons or poisoned bullets (Buchenwald, 1943-1944).
- Inoculation with contagious diseases, including yellow fever, smallpox, typhus, paratyphoid A and B, cholera, and diphtheria (Buchenwald and Natzweiler, 1941-1944).27
- Sterilization by x-ray irradiation, surgical castration, or injection of various substances such as formalin or silver nitrate into the fallopian tubes (Auschwitz and Ravensbrück, 1941-1945). Many camp inmates were irradiated without their knowledge while filling in forms.
- Experiments on twins, designed by the Nazi physician Josef Mengele to demonstrate genetic and eugenic similarities and differences and to see whether the human body could be manipulated unnaturally.28 Procedures were performed on over 1500 pairs of imprisoned twins, of whom fewer than 200 individuals survived. Some experiments were absurd: various substances were injected into the eyes of twins to see whether eye color would change, for example, and twins were sewn together to see whether a conjoined pair could be created.29
- The euthanasia program (Gnadentod, or so-called mercy killing), which became the systematic extermination of psychiatric patients in gas chambers. This method was later applied in the attempted genocide of Jews, the Romani, and other ethnic groups.
At the end of World War II, the excesses of Nazi rule led to the drafting of the Nuremberg Code (1947), in which ethical principles and guidelines to protect human subjects of experiments were set out in an effort to reconcile medical research and ethics. Written by the physicians Leo Alexander and Andrew Ivy, the code was based on the criteria used to condemn the Nazi physicians tried by the Nuremberg court (1946-1947).30 The studies these physicians carried out were cruel enough to be considered crimes against humanity. One of the charges brought against those in the proceedings known as the Doctors’ Trial was that they had conducted medical experiments without subjects’ consent.31
The Nuremberg Code insisted that the voluntary informed consent of individuals to an experiment, free of coercion of any kind, is absolutely essential; that unnecessary physical and mental suffering must be avoided; and that an experiment must be justified by the expectation of fruitful results for the good of society.
Unethical Research in Other Places and Circumstances
Nazi Germany has not been the only society to engage in ethically reprehensible experiments. Other countries unfortunately continued to carry out such research in spite of international guidelines, bringing into the open the tension between the need for scientific evidence and the procedures used to obtain it.
- 1931. Cornelius P. Rhoads, a pathologist with the Rockefeller Institute for Medical Research, infected subjects in Puerto Rico with cancer cells. Thirteen of them died.
- 1931-1933. At the Elgin State Hospital in Illinois, radium-226 was injected into psychiatric patients as an experimental treatment for mental illness.
- 1941. W. C. Black infected a 12-month-old infant with herpes as part of a medical experiment.
- 1944. US military doctors infected 400 inmates of a state prison near Chicago with malaria in order to study the course of the disease and develop a treatment. A year later, 800 more prisoners were infected with malaria in Atlanta.
- 1944. Researchers from the University of Minnesota and the University of Chicago injected phosphorus-32 into subjects in order to study hemoglobin metabolism.
- 1944-1945. As part of the Manhattan Project for the development of the atomic bomb, soldiers at Oak Ridge and patients at Billings Hospital (University of Chicago) were injected with plutonium.
- 1944-1945. The Japanese physician Shiro Ishii conducted various experiments on prisoners to study their resistance to botulism, anthrax, brucellosis, cholera, dysentery, hemorrhagic fever, and x-rays, as well as their tolerance of freezing temperatures. His studies included a number of vivisections.32
- 1945-1949. At Vanderbilt University in Tennessee, radioactive iron was injected into poor pregnant women at doses 30 times the toxic level.
In view of such experimentation, the head of the Catholic Church decided to address the issue in a statement defining his view of morality in medical research.33 Speaking to participants of the First International Congress on the Histopathology of the Nervous System (1952), Pope Pius XII defined 3 relevant criteria:
1) The medical researcher may not set aside his ethical obligations.
2) The interests of science and society, of the researcher, and of the individual subject are not absolutes; rather, they are subject to a higher moral authority.
3) Ethical constraints must set limits on science in order to guide it and humanize it.
Later, the Church elaborated further during the Second Vatican Council, particularly in its Pastoral Constitution on the Church in the Modern World, Gaudium et Spes.34
The Declaration of Helsinki
Because the Nuremberg Code's ethical guidelines for experimentation on human subjects were not generally accepted, the World Medical Association was constituted in London in 1946. The first general assembly (Paris, 1947) approved a series of resolutions condemning the actions of German physicians after 1933. The association's eighth assembly, in 1954, adopted a resolution on human experimentation that set out principles for those conducting research. In turn, that resolution would lead, in 1964, to the Declaration of Helsinki, which would become the international guideline of reference for biomedical research, incorporating the spirit of the Nuremberg Code and refining it.
The principle that underlies the declaration is respect for the individual (Article 8), who has the right to self-determination and to make informed decisions (informed consent, Articles 20-22), including the decision to participate in research voluntarily. Such consent is exercised at the start of a study and may be withdrawn at any point during its course. The physician's sole obligation is to the patient (Articles 2, 3, and 10) or the volunteer (Articles 16 and 18), and while the need to carry out research is recognized (Article 6), the well-being of the individual subject always outweighs the interests of science or society (Article 5); moreover, ethical considerations must be consistent with legal precepts and regulations (Article 9).35
The Declaration of Helsinki has been revised on a number of occasions (Tokyo, 1975; Venice, 1983; Hong Kong, 1989; Edinburgh, 2000) and has now become the internationally recognized guideline of reference for the ethical conduct of research. Research ethics review boards were introduced in 1975, the use of placebo treatments was regulated in 1996, and continued access to treatment after a study was guaranteed in 2000. These important issues had wide-ranging impact on regulations within countries and on other international guidelines such as those of the Council for International Organizations of Medical Sciences (CIOMS).
Beecher's Whistle-Blowing Article
In 1966, the New England Journal of Medicine published a devastating article by Henry K. Beecher, an anesthesiologist and professor at Harvard Medical School, who denounced 50 trials that failed to meet ethical standards yet were running in the United States at the time.36 Beecher also cited Pappworth's list of 500 articles in the literature that were based on unethical medical experimentation. Among the studies Beecher called into question, the following provide examples worth contemplating:
- 1956-1971. The hepatitis study at the Willowbrook State School on New York's Staten Island. In order to study the pathogenesis and epidemiology of hepatitis, residents of this facility for mentally handicapped children were deliberately infected with the virus, an act justified with the argument that earlier admissions had become infected spontaneously anyway.37,38
- 1963. Chester M. Southam, who had already injected live cancer cells into inmates of the Ohio State Prison, repeated that experiment with 22 elderly African American patients at the Brooklyn Jewish Chronic Disease Hospital in order to study their immune response. Southam told the patients they were receiving “some cells” but failed to mention that they were cancer cells. He justified not obtaining informed consent by saying he did not wish to cause alarm. Even though the state's medical licensing board placed him on probation, he was paradoxically later named president of the American Cancer Society.
The study that has perhaps generated the deepest concern, however, has been the Tuskegee syphilis study (1932-1972), in which black men with this venereal disease were left untreated by the US Public Health Service in the state of Alabama. The natural history of syphilis was observed without intervention in 399 black sharecroppers, most of whom were illiterate.
This experiment unleashed great controversy and led to changes in the legal protection afforded patients in clinical trials. The Tuskegee men gave no informed consent to the study and were not told of their diagnosis. Instead, they were tricked into believing they had “bad blood” and in exchange for enrolling in the study were given free medical treatment, free transportation to the clinic, meals, and burial insurance.
When the study began in 1932, the treatments for syphilis (Salvarsan, bismuth, and mercury salves) were toxic, dangerous, and of uncertain efficacy. Determining whether the benefits of treatments made up for their toxicity was an aim of the study, which also sought to characterize the different stages of the disease so that treatments appropriate at different moments could be developed. The doctors recruited 399 black men who supposedly had syphilis, planning to study them over the next 40 years. A group of 201 healthy men were also studied for comparison.
In 1943, treatment with penicillin was introduced.39 Although this drug was safe and was widely available by 1948, the Tuskegee study inexplicably continued until 1972. Those directing the research not only withheld information about penicillin in order to continue observing the progression of the disease until the patients' deaths, but they even warned the men to avoid treatment with penicillin, which was already being used by other patients in the area. The Tuskegee experiment was not a secret. Its results appeared in the medical literature on several occasions,40–43 yet it was halted only in 1972 after news media exposure.44 Of the 399 subjects, 28 had already died of syphilis and 100 had died of related medical problems. Additionally, 40 of the men's wives were infected and 19 children were born with congenital syphilis.45 An expert report eventually published about the study concluded that “society can no longer afford to leave the balancing of individual rights against scientific progress to the medical community.” Years later, in 1997, US President William Clinton offered a public apology for the Tuskegee experiment.46
Bioethics
The term bioethics was coined in 1970 by V. R. Potter47 in the context of the problems technology was introducing into a world in the midst of a crisis of values. Bioethics can be defined as “the systematic study of human conduct in the area of the life sciences and health care, insofar as this conduct is examined in the light of moral values and principles” (p. xix).48 Today we can perceive a break between science and technology on the one hand and the humanities on the other. The rupture has its roots in the enormous growth of technology, which gives us the power to manipulate the most intimate aspects of a human being and to alter the environment, in the absence of a similar growth in responsibility through which our new-found power ought to be harnessed for the benefit of mankind and the environment.49
The effort to bridge the gap between experimental science and the humanities50 gave rise to a discipline from which we expect to see the formulation of principles that will allow us to cope responsibly with the enormous technological potential that would have been unimaginable only a few years ago. Bioethics, which is developing at a fast pace, not only contemplates systems to guide ethical medical practice but also looks at standards for guiding experiments in human beings.51
The Belmont Report
Following the scandal of the Tuskegee experiment, and based on the work of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research (1974-1978), the US Department of Health, Education, and Welfare revised and broadened the regulations protecting human subjects at the end of the 1970s and beginning of the 1980s. In 1978, the commissioners drafted a statement specifying ethical principles and guidelines for the protection of human subjects of research. This paper is known as the Belmont Report, after the conference center where the commission convened.
The fundamental ethical principles of the commission's various reports were collected into a single document, and explanations and recommendations were added. The 3 ethical principles on which research on human subjects must rest are as follows:
- Respect: This principle protects the right of individuals to be treated as autonomous agents and requires their voluntary informed consent to participation.
- Beneficence: The possible scientific benefits must be maximized and the possible risks to subjects minimized.
- Justice: Procedures must be applied for good reason and administered fairly, and subjects must not be exploited.
Today, the Belmont Report continues to be an important guide for investigators and groups conducting research on human subjects, ensuring that projects remain within ethical boundaries.
The CIOMS Guidelines
In 1982, the CIOMS, in collaboration with the World Health Organization, proposed international guidelines for biomedical research in humans. These guidelines were developed mainly to establish the basis for applying the principles of the Nuremberg Code, the Declaration of Helsinki, and especially the Belmont Report in developing countries. They pay particular attention to the cultural and socioeconomic situations of these countries, attempting to regulate the possible use of human beings for experimental purposes, especially with regard to large clinical trials of vaccines and drugs and with particular reference to the context of AIDS research. The 15 CIOMS guidelines have been updated several times.
Conclusions
Medical research is not always accompanied by the desired respect for ethical standards. The practice of self-inoculation with infectious material (gonorrheal pus by John Hunter,52 the Peruvian wart by Daniel Alcides Carrión when he was a student,53 fungi by Köner or Strube,54 and others) soon led to a search for so-called human guinea pigs. These were most readily found among subjugated populations (imprisoned criminals or prisoners of war, inmates of concentration camps), among outcasts (psychiatric patients, individuals hospitalized with chronic conditions), or among groups living on the margins of society (prostitutes, the homeless, or ethnic groups bearing the brunt of racial prejudice).
Unethical practices were not specific to the Nazi regime (although they were certainly present on a massive scale in the Nazi concentration camps), but rather have been seen (and continue to be seen) in many places and under many circumstances. Africa, for example, has become a continent where drugs can be tested outside the bounds of international standards. In 1996, trovafloxacin, an antibiotic still under study at the time, was administered to children in Kano, Nigeria, by a multinational drug company; 11 died and dozens of severe complications (deafness, blindness, arthritis, liver toxicity) ensued.55,56 Wars also continue to be settings for human experimentation. Soldiers deployed to the Gulf War in Iraq were inoculated with Mycoplasma incognitus and exposed to a variety of chemical agents, radiation, and drugs that left 100 000 US and 6000 British troops with sequelae (tiredness, neurologic disorders, memory loss, malformations in offspring, etc).57–59 Suspicions have been voiced, or accusations made, about possible medical experiments at Guantánamo Bay in Cuba,60 in Israel,61 and in several countries of the Third World.
Not even the great figures of medicine have been free of the taint of these deplorable practices. The article in this issue by Cuerda and colleagues,62 which served as the point of departure for the present opinion article, reminds us that some of the disease eponyms we use today are linked to physicians who at times clearly engaged in immoral behavior.
Nonetheless, I am not sure that their names should be forgotten. Reprehensible as their careers might have been, they did make some useful contributions which, rightly or wrongly, have become part of our medical language. I do not believe that at this point we need to question the name of the bacterium Neisseria gonorrhoeae because Albert Neisser behaved unethically. These problems arise to one degree or another in relation to all historical figures. Julius Caesar and Hernán Cortés did not always behave properly, but it occurs to no one to erase their names or forget their deeds in a pointless gesture of obliterating their memory (damnatio memoriae).
However, a separate question is that of shedding light on the dark sides of these figures’ lives, and perhaps it would be useful to do so. Above all so that we will not follow in their footsteps. So that we avoid idealizing persons who were perhaps interested in medicine but who displayed profound contempt for human beings and their inalienable rights. And so that we might remember that sometimes the unbridled desire for professional glory can lead us to forget to exercise the scruples and intellectual humility that can help us avoid the bias that comes with pride, egotism, and pretentiousness. Scruples and humility can point the way instead toward spiritual rectitude and encourage the search for humankind's best attributes. In the end, our most sacred duty is to share and enjoy life with our fellow human beings.