The firm knowledge that bacteria were causes of diseases and were the transmissible agents responsible for contagion was acquired in the nineteenth century, but the idea that there were tiny creatures which could produce illness had been held for thousands of years. Varro, in the first century B.C., had said that swampy land was dangerous because “certain minute animals, invisible to the eye, breed there, and borne of the air reach the inside of the body by way of the mouth and cause disease.” In the Middle Ages, shunning lepers, fleeing from areas of pestilence, and segregating the severely ill all represented an awareness that diseases could be transmitted. In the sixteenth century, Fracastoro demonstrated extraordinary perception in his assumption that there were “seeds” in the environment which could multiply in the body and produce disease. His contemporary Girolamo Cardano reasoned that these “seeds of disease” were live creatures. Athanasius Kircher, a Jesuit cleric of the seventeenth century in Rome, saw through a crude, early microscope that vinegar and sour milk contained “worms” and that the blood of people who died of the plague harbored animalcula.
Among the early proofs that living organisms could cause disease was the discovery of harmful parasites and fungi. In 1589 Thomas Moffet made accurate drawings of lice, fleas, and mites. He even described the mite which caused scabies and advised combating it with sulfur, a medication that remained the effective treatment for centuries. Moffet also described a disease of silkworms hundreds of years before Pasteur, but his reports were ignored.
The discoveries of Leeuwenhoek in the seventeenth century had led to the microscopic observation of other heretofore unseen creatures, but they were regarded as incidental findings. Furthermore, many fantastic tiny creatures were imagined rather than seen under the early microscopes, so the possible association of microscopic animalcula with illness was for the most part shrugged off.
Evidence slowly began to accumulate. When Agostino Bassi of Lodi (1773-1856) in the early nineteenth century linked a disease in the silkworm with a fungal parasite (Botrytis paradoxa), he enlarged his views to suggest that many contagious diseases such as smallpox, typhus, plague, and cholera were also due to live organisms—as yet undiscovered. Jacob Henle in the mid-nineteenth century deduced from earlier published reports that living organisms were indeed the causes of infections. In promulgating a series of precise requirements for a specific organism to be considered the causative agent, he antedated by several decades the postulates of his pupil Robert Koch. He even theorized that the parasites which caused the contagious illnesses were probably plants, and bacteria were so classified until very recently.
The first studies of the pathogenic nature of bacteria were on a relatively large-sized and easily seen bacterium, the bacillus (rod-shaped) of anthrax, a fatal disease in sheep and horses. Casimir Davaine and Pierre Rayer in 1850 produced the deadly disease in healthy animals by injecting them with the blood of dying sheep. They subsequently found the anthrax organism in the blood of the sheep so killed. Others were able to repeat and confirm the experiments.
The ancient view that life could arise from inanimate substance was still a widely held concept in the nineteenth century, for it seemed logical to believe that maggots commonly found in decaying matter were hatched through fermentation and putrefaction. When bacteria were seen under the microscope to be ever-present in sour milk and spoiled meat, it appeared rational to assume that they similarly arose as a result of chemical processes. Even though scientists in the seventeenth century had discovered that maggots came from eggs deposited in rotting matter by adult insects, opinion was still strong that they were generated by fermentation and putrefaction. In the eighteenth century, Lazzaro Spallanzani clearly proved that no organisms could develop in a sealed flask of liquid heated long enough to destroy any living creature, but belief in spontaneous generation remained rooted in scientific thought. In the early nineteenth century, Theodor Schwann concluded that the chemical processes of fermentation and putrefaction were themselves the result of activity by the live organisms. It is interesting that commercial producers of food and wine made practical use of these ideas long before scientists appreciated their implications.
Contagion and Biologic Prevention of Disease
Although belief in religious and supernatural causes and cures for disease was predominant among ancient peoples, many realized that some illnesses could be transmitted or even prevented by nonreligious means. For instance, the Indians and the Chinese learned that purposely contracting a mild case of some diseases could confer a resistance to subsequent occurrences of the illness. This seemed to indicate that disease could be passed from person to person without divine intervention. Centuries later a battle developed in Europe and the United States between those who believed that diseases were definitely contagious and those who ascribed epidemic illnesses to causes such as environmental change and internal bodily derangements. The controversy reached its height in the eighteenth century.
The anticontagionists, who numbered among them some outstanding scientists and physicians, had noticed that quarantine was not convincingly successful and that an epidemic such as yellow fever was often terminated by a change in the weather. Furthermore, they observed that even people in contact with yellow fever victims did not necessarily contract the disease. (They did not know that mosquitoes were responsible for transmitting the infective agent or that their absence in winter ended the threat of being bitten and infected.) In addition, the apparently spectacular cures of yellow fever with drastic purges reported by Benjamin Rush in the American colonies convinced many that contagious infective organisms could not be the cause. Indeed, had not other completely different causes of disease been proved, such as a dietary insufficiency in scurvy? The fact that epidemics were most frequent in the crowded conditions of slums was interpreted by the anticontagionists as additional evidence that the environment was the prime cause—unhealthy air, poor food, polluted water—rather than living creatures. On the other hand, the spectacular results of inoculations to prevent smallpox were a strong, added argument in favor of those who believed disease to be transmissible. Edward Jenner had introduced the new concept of creating an immunity to a dangerous disease by producing an entirely different mild illness through vaccination. Therefore, as innovative as Pasteur’s work was to be and as skeptical as many scientists were, the use of biological methods to prevent disease had already entered the atmosphere of medicine.
Asepsis and Semmelweis
Nevertheless, even those who accepted the principle that disease could be transmitted from person to person apparently failed to see the connection between contagion and the gangrenous complications of surgical wounds or the fatal childbed (puerperal) fever. In the eighteenth century, Charles White of England and Joseph Clarke and Robert Collins of Ireland had sharply reduced the occurrence of postpartum sepsis (infection) by regimens of personal and environmental cleanliness, limited vaginal examinations during labor, and active cleansing of beds and linen. Yet virtually no one seems to have continued these practices. Even when Oliver Wendell Holmes in 1843 attributed puerperal fever to infections carried to the new mother by obstetricians from other infected persons, most physicians regarded this as impractical theorizing without proof. It was Ignaz Semmelweis (1818-65), in keeping with the new statistical spirit of the nineteenth century, who assembled the facts and analyzed the happenings on the obstetrical wards of the Allgemeines Krankenhaus in Vienna to prove the contagious nature of postpartum infection.
He noted that the annual mortality rate on one of the wards where medical students were trained was over ten percent and that it reached almost twenty percent during some months—chiefly due to puerperal fever. On the ward where midwives received instruction, the deaths never even reached as high as three percent. Ignoring the seemingly obvious conclusion that staff skills on the second ward were therefore superior, he discovered that the doctors and students usually came to the ward to examine patients directly from the autopsy room. In contrast, the midwives and their teachers on the second ward participated in clinical instruction without attending the autopsies. Furthermore, he noted that the women who came down with infections were usually in a row of beds conforming to the routine of examination that day. The suspicions of Semmelweis were further confirmed when he viewed the autopsy of his colleague Kolletschka, who had died of an infection from a scalpel wound sustained while performing an autopsy on a puerperal fever victim. His friend’s organs showed the same changes as seen in the bodies of those dead of postpartum infection.
The next step for Semmelweis was clear: to require the physicians and students under his charge to scrub hands with soap and water and soak them in a chlorinated lime solution before entering the clinic or ward, and to repeat this after each examination. Despite complaints, he persisted in his demands. Over the next few months, the eighteen percent obstetrical death rate declined startlingly to a low of one and two-tenths percent. One might have expected the hospital staff immediately to have embarked upon a similar regimen, or, at the very least, to have tested the conclusions further by repeating the clinical experiment. Instead, the chief of service, apparently for reasons of personal antagonism, condemned Semmelweis, arranged to have him lowered in rank, and further limited his practicing privileges. When he reported his results to the Medical Society of Vienna, his paper was greeted with virulent attacks. Although some outstanding medical scientists and practitioners, such as Rokitansky in pathology, Skoda in medicine, and Hebra in dermatology, supported Semmelweis, he was too deeply hurt to remain in Vienna and returned to Budapest, where his methods effected a marked diminution in mortality rates. Indeed, Semmelweis may be credited with having for the first time constructed a statistically tested system of asepsis (keeping germs away from the patient) before the germ theory had arrived.
When he finally completed a book, Die Aetiologie, der Begriff und die Prophylaxis des Kindbettfiebers, in 1861, ten years after his original discoveries, the profession hardly took notice, and prestigious scientists like the great Virchow actually opposed his ideas. The brilliant, intense, and sensitive Semmelweis ultimately was broken by the indifference and callousness of his superiors and colleagues. Committed to an asylum, he died in 1865 of a blood infection, virtually the same kind of illness that had stricken the mothers he had tried to save.
Antisepsis and Lister
In contrast to Semmelweis, Joseph Lister (1827-1912) had the advantages of a prestigious position in Glasgow, an intellectual climate already modified by works on infection and germs, an ability to present views in a simple manner, and an equanimity that enabled him to continue on his course undeterred by criticism.
Among the variety of substances used on wounds from earliest times, some like wine and turpentine were probably antiseptic in effect while others no doubt contributed to infection. Pus was generally the expected concomitant of wounds, but there was virtually no understanding of how it was produced. Simpson of St. Andrews in the eighteenth century had realized that pus came in some way from capillaries, and Julius Cohnheim (1839-84) later proved that white blood cells (pus cells) indeed migrated out through the walls of tiny blood vessels into inflamed tissues. However, these and other reports gave no hint that the inflammations and putrefactions which brought forth pus could be due to agents passed from person to person.
Lister noticed that broken bones over which the skin was intact usually healed without complication, but fractures where bone was exposed through breaks in the skin commonly developed infections and the drainage of pus. He saw the frequent severe infections attending other operations, such as amputations, as additional evidence that something circulating in the air was responsible—possibly invisible particles which he called “disease dust.” When the work of Pasteur in 1860 was brought to his attention by Thomas Anderson, he appreciated the connection between his own observations on wounds and the microscopic bacteria involved in fermentation. Whereas Pasteur used heat to sterilize, Lister sprayed carbolic acid over the patient during an operation to kill any bacteria before they could grow in the wound. In 1867 he published a report in the Lancet of his experiences with eleven cases, and he gave full credit to Pasteur’s work.
As had happened with Semmelweis, Lister’s reports were received with either indifference or open hostility. In the United States, where Lister went to present his views, surgeons remained generally unconvinced as late as the 1880s. Nine years after Lister’s publication, Samuel Gross (1805-84), the virtually unchallenged leader of American surgery, could write, “Little if any faith is placed by any enlightened or experienced surgeon on this side of the Atlantic in the so-called carbolic acid treatment of Professor Lister.”
Many European and American surgeons failed to see the implications of the revelation that infections in wounds came from something foreign introduced at operation: that these invaders should be eliminated or excluded. Instead they focused on the antiseptic itself (carbolic acid) and the mechanics of its use in sprays and soaks, thereby missing entirely the main concept. Yet, almost subconsciously surgeons had absorbed Lister’s principles of keeping out germs. B. A. Watson of New Jersey could later say, “We find scarcely a wound treated in the United States today but what some part of Listerism is adopted.”
Perhaps the first prominent support of Lister’s theories came from Saxtorph of Copenhagen in 1870. Surgeons in the German-speaking countries also readily recognized the part played by bacteria in infection and saw the importance of asepsis and antisepsis. Among the notable surgeons who followed the methods of Lister were Von Volkmann, Von Langenbeck, Czerny, and Von Mikulicz. Billroth, one of the greatest innovators and teachers of surgery, did not believe that bacteria were important in wound infections but was willing to use the Listerian system after seeing that it yielded consistently better results.
French surgeons also subscribed to the new doctrine relatively early, so that in 1876 Lucas-Championniere could write, “A few years ago Paris hospitals were reckoned among the very worst, even by some of their own surgeons. Now surgery may be carried out in them as anywhere else.” Increasing numbers of surgeons in other countries took up the system. Some who observed strict rules of cleanliness nevertheless denied the role of germs. Lawson Tait in England, while criticizing the antiseptic carbolic methods and denigrating the germ theory of disease, nevertheless observed strict rules of cleanliness in what came to be called the “aseptic” (killing or excluding germs before they enter a wound) system, as opposed to the “antiseptic” (killing or removing germs after they have entered). It was Lister’s great contribution to emphasize in the minds of surgeons the necessity for getting and keeping wounds free of contamination.
The employment of rubber gloves in operations was an innovation of the early twentieth century. When William Halsted introduced them to protect the hands of his operating-room nurse (whom he later married), one of his students suggested their use by operators too, since they could be sterilized. At first the gloves were relatively thick, and many surgeons refused to wear them. Even when the rubber was made thinner, some operators, especially in Europe, wore sterile cloth gloves over the rubber. Masks were brought in even later, and as recently as the 1940s and 1950s many highly placed surgeons left the nose uncovered, wearing the mask over the mouth only.
It was the monumental work of Louis Pasteur (1822-95) which simultaneously demolished the theory of spontaneous generation, firmly established the germ theory of disease, explained the effectiveness of the asepsis and antisepsis of Semmelweis and Lister, and laid the basis for the biological preventive measures of the future.
Trained as a chemist, Pasteur found himself ever more involved in biological phenomena and thus by example demonstrated the interrelation of the branches of science. Early in his career, at his installation as professor and dean of the faculty of science at Lille in 1854, he stated, “In the field of observation, chance only favors prepared minds.” His own activities were to be a striking illustration of this aphorism.
Pasteur’s earlier research, in 1848, had been a major step in the development of stereochemistry, for he proved that there were two types of tartaric acid crystals. When he discovered that bacteria acted on each type of crystal differently, he applied these observations to the problems of beet alcohol production, which he was asked to investigate by a local manufacturer. The result was not only a solution to the production difficulty but an entirely new focus for Pasteur’s energies, which was to culminate in the germ theory of disease. In rapid order he proved that microscopic live creatures were responsible for fermentation, and that some of these tiny organisms grew in the presence of oxygen (“aerobic”) while others lived in the absence of free oxygen (“anaerobic”). He also determined that heating wine for a few moments at about 60°C. (140°F.) destroyed the organisms which produced spoilage—a process later called “pasteurization.”
When an epidemic in silkworms threatened to devastate the industry, he discovered that two different diseases were responsible: pebrine, caused by infection of the eggs, and flacherie, due to an infective intestinal organism. Weeding out the affected eggs and removing the source of infection in the food saved the silkworm industry not only in France but also throughout the world.
Suddenly he suffered a stroke from which he recovered only slowly and incompletely. Having already accomplished myriad far-reaching innovations—enough to earn him a place among the greatest scientists of history—he might well have declared a moratorium on further work. But the most famous and useful of his contributions were yet to come.
His next step was to study the fatal illness in sheep called anthrax (“splenic fever”). He was able to isolate the anthrax bacillus, confirming Koch’s earlier work, but he could not see a way to prevent or treat the disease. Therefore, when he was called to investigate the cause of chicken cholera after serious losses to poultry farmers, he could not have foreseen that his study would lead to a means of preventing anthrax and, more importantly, to a discovery that would revolutionize preventive medicine.
On returning from a vacation, he found that cultures of the chicken cholera organism prepared before he left were harmless when injected into healthy fowls, but subsequent injections of fresh virulent cultures into the same hens failed to produce the disease. Armed with this new knowledge, Pasteur treated cultures of the anthrax bacillus in various ways until he found that microbes grown in a particular temperature range became harmless without losing their capability of producing resistance in injected animals. To test the validity of his discovery, a public demonstration was arranged by the Melun Agricultural Society in 1881. While a skeptical assemblage of physicians, veterinarians, curious citizens, and reporters looked on, virulent cultures of the anthrax bacillus were injected into normal sheep and into an equal number of sheep previously inoculated with attenuated, harmless cultures. In the next few days, all of the unprotected sheep died and all of the prepared sheep remained well. The principle of immunity was thus publicly and dramatically launched. Whereas Jenner had obtained the same protection from smallpox by producing another illness, vaccinia or cowpox, Pasteur established the fundamental principle that attenuated cultures of an organism could afford protection against the disease caused by the organism. Just as Lister had given recognition to Pasteur for his importance to antisepsis, so now Pasteur paid tribute to Jenner’s work by calling his own method “vaccination.”
Although it had been known that the “poison” of rabies (hydrophobia) was in the saliva of afflicted animals, Pasteur reasoned from the symptoms that it must also reside in the central nervous system. Working with the spinal cords of rabbits he confirmed his suspicions, and, extending his findings with attenuated cultures of anthrax, he developed an extract containing the offending agent of rabies in a nonvirulent form. Having developed a system of protecting rabbits by injections of extracts of more and more virulence, he awaited the chance to apply the method to humans.
In 1885 a young boy named Joseph Meister, who had been bitten by a rabid dog, was brought to him. Pasteur first consulted two physicians who agreed that the boy’s outlook was hopeless. One can well understand his caution, for today such a course of action with an untried medication containing dangerous constituents would be unthinkable. But Pasteur proceeded. As the virulence of the injections was increased, he watched ever more closely for signs of rabies, which would have developed in three to six weeks. When the boy remained well after the final inoculation with an extract that ordinarily was rapidly fatal to rabbits, Pasteur knew that his hypotheses and experiments had been correct. The success of the antirabies inoculation gained him widespread public acclaim, for it represented the first time that Pasteur’s methods had been applied directly to humans.
From then on, bacteriology and immunology followed an ever-widening course. Pierre-Paul-Emile Roux (1853-1933), Pasteur’s pupil, reported finding a filterable virus (one that passes through the finest filters). Together with Alexandre-Emile Yersin (1863-1943) he isolated the diphtheria bacillus and developed an antitoxin. More and more bacteria were discovered, numbers of vaccines and antisera were produced, and the mechanisms of prevention became increasingly clarified.
While practicing as a country physician, Robert Koch (1843-1910) used his spare moments to begin studies on microorganisms. By the time of his death, he had revolutionized bacteriology, established the sporulation and pathogenic character of the anthrax bacillus, developed and refined techniques of culturing bacteria, advanced the method of steam sterilization, discovered the causes of many diseases (including wound infections, cholera, Egyptian ophthalmia, and sleeping sickness), and introduced effective preventive measures in typhoid fever, plague, malaria, and other diseases. Perhaps his two most influential contributions were the isolation of the tubercle bacillus, the cause of tuberculosis, and the establishment of the essential steps (“Koch’s postulates”) required to prove that an organism is the cause of disease. His studies with tuberculin (a filtrate from the culture of the tubercle bacillus) convinced Koch and the world that he had found the means of curing tuberculosis, but its subsequent failures were a severe blow to him. However, Koch’s tuberculin is still used as a diagnostic tool.
Ferdinand Cohn (1828-98) of the University of Breslau helped to establish the science of bacteriology through his techniques and concepts, including a definitive classification of bacteria. He also gave vigorous support to Davaine, Koch, and others who worked in the field. So many other contributors to microbiology have followed in the nineteenth and twentieth centuries that merely listing the names would fill a volume.
In the last decade of the nineteenth century, two especially important additions were made to the understanding of infections: the development of antitoxins and the discovery of viruses.
After Yersin and Roux had found that the toxins produced by the bacteria of diphtheria—not the microorganism itself—could cause the damage to the body that made up the clinical picture, other bacterial diseases were similarly studied. The independent proofs by Von Behring and Kitasato that the body manufactured circulating substances which acted against the toxins prompted Von Behring to obtain neutralizing antitoxic sera from the blood of animals injected with intermittent doses of the toxins. This use of antitoxic sera in treating diseases was called “passive immunization” by Paul Ehrlich, in contradistinction to “active immunization” by inoculating attenuated cultures of the pathogenic organism or its toxin to call forth protecting antibodies. Jenner’s vaccination against smallpox and Pasteur’s antirabies inoculations were examples of the active production of immunity.
The recognition of antibodies in the blood of a person sick with an infection became useful in diagnosis: finding the specific antibody against a particular germ in the blood determined which organism was the cause. Because of the obvious part played by the circulating toxins and antitoxins, the importance of biologically produced chemical substances appeared to overshadow the role of cells in fighting disease until Elie Metchnikoff (1845-1916) proved that some cells destroyed bacteria by engulfing them (phagocytosis), for which he shared a Nobel prize in 1908. The realization that humans could be carriers of the pathogenic germs of cholera, diphtheria, typhoid fever, meningitis, and dysentery without being ill themselves was also a forward step in understanding host resistance to disease and was also a significant contribution to public health.
Most of the germs responsible for illness could be seen under the microscope to be bacteria. The organisms for smallpox and rabies had not been seen, but the presence of their “poisons” had been appreciated and utilized in immunization. When Loeffler and Frosch reported that there were pathogenic organisms (in foot-and-mouth disease of cattle) which were so small that even fine filters allowed them to flow through, a new field of investigation, virology, was opened. Subsequently, other research revealed that there were also tiny microbes of a size between bacteria and viruses (named rickettsia for Howard Ricketts [1871-1910], who found the cause of Rocky Mountain spotted fever). Larger protozoan organisms were also found to produce diseases (for example, the plasmodium of malaria).
Intermediate Vectors of Disease
Studies by veterinarians and zoologists showed that some disease-producing organisms underwent life cycles in animal hosts, in the course of which they could infect humans: for instance, lice in typhus, mosquitoes in filariasis, and the tsetse fly in sleeping sickness. The role of these intermediaries in the transmission of disease was suggested by a number of investigators, notably Patrick Manson, Theobald Smith, and F. L. Kilborne.
The mosquito as an insect vector in malaria was partially understood in the eighteenth century by Lancisi and clearly perceived by several observers in the nineteenth century—Beauperthuy of Venezuela, King in the United States, Laveran in France, Flugge and Koch in Germany. In 1880 Charles-Louis-Alphonse Laveran (1845-1922) actually demonstrated the causative organism, a protozoan, but the mechanism of transmission was not proved until fifteen years later by Ronald Ross (1857-1932), who found the parasite in the stomach of an anopheles mosquito which had imbibed the blood of a person with malaria. He was able to transfer the disease from malarial birds to healthy birds through the bites of mosquitoes. This was subsequently proved in 1898 by Grassi and collaborators in Italy also to be the mechanism of transmission to humans.
Although Beauperthuy had stated in 1853 that a mosquito carried yellow fever, Carlos Finlay (1833-1915), a Cuban physician, clearly enunciated the proposition in 1881 that the Aedes aegypti was the insect vector responsible for communicating yellow fever from person to person. When yellow fever became a major problem to the United States following its occupation of Cuba after the Spanish-American War, the army sent a group to seek a solution. Walter Reed (1851-1902), the chairman, James Carroll, Jesse Lazear, and Aristides Agramonte devised human experiments to test Finlay’s theory. Members of the commission itself, soldiers of the occupying force, and civilian employees volunteered as subjects. Lazear, an accidental victim of the disease, died during the investigation. The others recovered.
The commission reported in 1901 that the mosquito was an intermediate host of the disease, transmitting the infective agent from person to person through its bite; that the causative organism in the blood of infected humans was a filterable virus; and that the disease could be transferred only through the mosquito’s bite and not by direct contact between persons. The resulting public health measures—directed to removing mosquitoes and protecting humans against the insects—eliminated yellow fever from Havana in less than a year. William Crawford Gorgas (1854-1919) was the leader of the sanitary engineering force which achieved the results.
Ehrlich and the Beginnings of Antimicrobials
The most important impetus to discovering ways of fighting microorganisms was given by the ideas and works of Paul Ehrlich (1854-1915). Through the centuries, of course, there had been a few relatively effective anti-infective agents in use, but the establishment of a discipline to seek agents against bacteria and to construct criteria for evaluating effectiveness was accomplished almost single-handedly by Ehrlich.
As a student he examined the methods of making cells visible under the microscope, and he wrote his doctoral thesis on histologic stains. From earlier investigators he took the relevant information which enabled him to develop a theory of specific affinity of cells for dyes: George Hayem’s techniques of coloring living cells (vital staining); Hermann Hoffmann’s stains of bacteria; the visualization by Carl Weigert of bacterial cocci in tissues. His demonstration of the staining characteristics of the white blood cells with aniline dyes allowed others to understand better the abnormalities of blood cells, thus contributing to the foundations of hematology. He also improved the methods of identifying the tubercle bacillus microscopically. Through techniques of neutralizing toxins with antitoxins in the test tube he illuminated the detection of antibodies by showing that some were heat-resistant and others heat-susceptible. His “side-chain theory,” abandoned for many years, nevertheless led others to investigations of importance to immunology. Moreover, in recent decades this theory has been revived by contemporary immunologists, albeit in a more sophisticated form.
As he gained recognition, he enlarged his scope of activities even further. When Fritz Schaudinn in 1905 proved the cause of syphilis to be the Treponema pallidum, Ehrlich synthesized chemicals which would destroy the causative organism while sparing the patient. Salvarsan, the arsenical which he made in his laboratory in 1910 (the 606th compound tried), became the standard effective therapeutic agent. Other less toxic arsenic compounds, such as neoarsphenamine, also arose out of his work. For the first time in history the patient with syphilis had a good chance to survive and be cured.
About three decades later, the advent of the sulfonamides for the treatment of bacterial infections was a direct, though delayed, outgrowth of Ehrlich’s demonstration that dyes could be antibacterial agents. When penicillin was introduced, Ehrlich’s drugs against syphilis were abandoned, but he had set in motion the activities of the twentieth century that were to revolutionize the therapy of microbial diseases.