Although the early decades of the nineteenth century were a virtual continuation of medical developments in the previous century, two particular advances (anesthesia and the discovery of microorganisms as causes of disease) so altered the course of medical history that concepts of illness, methods of treatment, and hygienic practices at the end of the century bore only slight resemblance to what they were at the beginning. Of course there were other highly significant contributions which advanced the understanding of the structure and function of the living organism, such as the demonstration of the cell as the fundamental anatomical unit, the enunciation of the physiological principles relating to the internal environment of the body, and the introduction of new diagnostic tools into clinical methods. The effects of these other advances were more cumulative, producing their greatest impact in the following century.
On the other hand, the organization of physicians, hospitals, and public health activities arose out of the nineteenth century itself, owing much to the alterations wrought by the Industrial Revolution. The rapid changes that followed the building of factories and the expansion of cities led to extreme shifts and crowding of populations. The conditions of factory workers, the spread of slums, and the interdependence of communities and nations also affected medical practice.
Before the discovery of bacteria as the causes of disease, the principal focus of preventive medicine and public health had been on sanitation: the provision of potable water and the dispersal of foul odors from sewage and refuse, which were considered the important factors in causing epidemics. The invention of the water closet by John Harington (1561-1612) facilitated flushing away human waste and helped to keep some dwellings clean, but the flow from these indoor privies ran into cesspools and ultimately into waterways and wells.
The health of workers in the factories was important to their efficient functioning, and since the spread of epidemic disease was a danger to all segments of the population, the need for remedial measures in public health was urgently appreciated. Johann Peter Frank’s System einer Vollstandigen medicinischen Polizey (1777-78), using statistics to establish the importance of public health, was a milestone even though its immediate influence was negligible. Edwin Chadwick’s 1842 report on the sanitary conditions and health of English workers, however, did have a great impact on the upper classes and the governing bodies. His standards for the proper removal of sewage and the protection of water supplies were a stimulus to the government of Britain, as was Rudolf Virchow’s militant advocacy of public health measures in Germany.
Epidemics continued to devastate cities and countries. As late as 1854 in London there were 14,000 cases of cholera with 618 deaths. In the United States, cholera ravaged the entire country three times in the nineteenth century. Yellow fever, after striking the northeast from 1793 to 1805, began a series of attacks upon the southern and Gulf of Mexico ports which reached a peak in the 1850s. A sharp decline followed until 1905, when one last explosive outbreak killed over 450 residents in New Orleans. Planned attacks on cholera, typhoid fever, and other pestilences only became feasible after the causes were discovered in the bacteriological era.
The assembling of large numbers of raw troops for the American Civil War was accompanied by inevitable outbreaks of communicable diseases. On both sides little attention was paid to camp sanitation, housing and food were atrocious, and confusion was rampant. No one anticipated the enormous casualties in the first few battles, and seriously wounded men often lay where they had fallen for several days. Many wounded died for want of immediate care, but the North, which lost control of the battlefields in the early fighting, took the heaviest casualties.
Gradually both sides evolved effective ambulance systems and hospitals, procured adequate medical supplies, and developed well-trained surgeons. Yet it was not until the Battle of Gettysburg (July, 1863) that the Union forces were able to remove their wounded from the field at the end of each day’s fighting. It had taken two years of bloodshed and suffering to develop a good medical corps.
The man largely responsible for reforming the Union Medical Corps was Surgeon General William A. Hammond, a bright, energetic individual whose much-needed reforms irritated enough regular army officers and politicians to lead to his dismissal in November, 1863, and his subsequent court-martial. The South was more fortunate in that a capable and intelligent surgeon general, Dr. Samuel Preston Moore, took over at the beginning of the war and was able to develop a sound medical corps with a minimum of interference. The South, however, lacked an effective transportation system, did not have as many well-trained surgeons, and was hard put to provide adequate medical supplies. Despite these handicaps, Moore was able to tender southern troops medical care comparable to that given the Union forces. Not until the transportation system began to break down toward the end of the war did the South have any significant medical problems.
As the century began, France held the lead in medicine. Francois Magendie (1783-1855) was one of the early French leaders who painstakingly tried to keep his observations simple and free of speculation. He is especially remembered for his experimental proofs that the posterior roots of the spinal nerves carry sensory fibers (receiving impulses to the cord) and the anterior roots are motor nerves (transporting impulses away from the cord to the muscles). The priority of this accomplishment was challenged by Charles Bell, among others, and the principle is now called the “Bell-Magendie law.” Magendie’s analyses of the actions of drugs also made him a founder of the discipline of pharmacology. Never holding an academic position, he was nevertheless a typical investigator of the early nineteenth century in combining the careers of medical practitioner and laboratory experimentalist.
On the other hand, Claude Bernard (1813-78), the virtual founder of experimental physiology, was entirely a man of the laboratory. He further developed the precepts of his teacher Magendie, posing questions that could be answered only through experimental vivisectional techniques, which he perfected into elegant experiments. One of his influential concepts was the constancy of the “internal environment” (the principle later named homeostasis): in warm-blooded organisms, physiological mechanisms resist any external factors which tend to alter this internal state.
Among other exceptional accomplishments, Bernard clarified the multiple functions of the liver, studied the digestive activity of the pancreatic secretions and the association of the pancreas with diabetes, and pointed out the connection of the nervous system with the constriction and dilation of the smaller arteries. In his Introduction to the Study of Experimental Medicine in 1865, he set down the standards for future experimentalists. An imaginative thinker, he had a firm attachment to objectivity, as indicated by his admonition, “Put off your imagination as you take off your overcoat, when you enter the laboratory…”
Another French contributor to physiology who was a practitioner of medicine as well as an investigator, Charles Edouard Brown-Sequard (1817-94), the son of an American sea captain and a French mother, is sometimes considered the founder of endocrinology, although Bernard had actually opened the field. In the course of his many travels, he lectured in either French or English on a variety of topics which included some of his own discoveries. Brown-Sequard taught that the adrenals, thyroid, pancreas, liver, spleen, and kidneys had secretions (later to be called hormones) which entered the bloodstream and could be used in treatment. He also believed that injections of extracts of the testis would produce rejuvenation.
Germany’s role in medicine was in a large measure due to the influence of Johannes Peter Muller (1801-58), who started out as a romantic “nature-philosopher” but later developed a more objective view of biological functions. His work focused on morphology (structure) rather than experiment, but he inspired numerous students who were to contribute to knowledge in physiology.
Lacking laboratories and the communities of scholars essential for much of the work in the basic sciences, America contributed little to physiology and histology, but there was one notable exception. William Beaumont (1785-1853), an obscure army surgeon who had learned medicine through an apprenticeship, took advantage of a rare opportunity to pave the way for the present understanding of the gastric process. In 1822, while serving in Fort Michilimackinac in northern Michigan, he was called to treat a French Canadian who appeared to be dying from a shotgun blast at close range. The lower chest and abdomen were torn open and the left lung, diaphragm, and stomach were badly lacerated. The wound was filled with blood, bone splinters, lead shot, wadding, bits of clothing, and the contents of the stomach. Although it seemed obvious that the wound was fatal, Beaumont cleaned and dressed it, and tried to make his patient comfortable. Miraculously, after a lengthy convalescence, Alexis St. Martin, the patient, survived, but he was left with a permanent gastric fistula giving direct access to his stomach. Notwithstanding strong objections from St. Martin and other difficulties, Beaumont conducted a long series of experiments which he summarized in 1833 in his classic work, Experiments and Observations on the Gastric Juice and the Physiology of Digestion. While his achievement received only limited recognition in America, European scientists hailed it as a major accomplishment.
In England, many physiologists illuminated the functions of the nervous system in the nineteenth century. Marshall Hall (1790-1857) may be mentioned for his work on shock and his discovery that some reflexes could be elicited without going through the higher centers. William Sharpey described the clearing action performed on mucous membranes by microscopic, mobile hairs, the cilia.
A far-reaching influence on physiology and on subsequent attitudes toward behavior came from the experiments on animals by Ivan Pavlov (1849-1936) in St. Petersburg. After having studied in the laboratories of Ludwig and Heidenhain in Germany, he became professor of pharmacology and then of physiology at the Military Medical Academy in Russia. He made detailed investigations on the heart, liver, pancreas, and alimentary tract, but his most influential work was on the conditioned reflex. For instance, he showed that one could condition a dog to salivate and its stomach to secrete in response to an outside stimulus even in the absence of food, by repeatedly linking the stimulus to the providing of food.
Chemistry and Pharmacology
Just as correlations were made between bedside findings and changes in the organs, so too was the chemistry laboratory brought to bear on understanding the functional alterations caused by disease. By the middle of the nineteenth century, examinations of blood and urine were routine.
One of the most significant accomplishments was the synthesis by Friedrich Wohler (1800-82) of urea, a natural product of the body, from an inorganic compound, ammonium cyanate. After that, the separation between the organic and the inorganic was no longer distinct. Organic chemistry would become merely the chemistry of carbon compounds. In applying the techniques of inorganic chemistry to the study of the chemical mechanisms of the body, Felix Hoppe-Seyler (1825-95) opened the field of physiological chemistry. His discovery in 1862 of hemoglobin (the oxygen-carrying substance in the red cells of the blood) was a milestone in medicine.
As advances in physiology and chemistry proceeded, it became possible to isolate drugs in pure form and thus examine their actions in animals and humans. Thus the discipline of pharmacology was developed. On the basis of preliminary investigations in France by J. F. Derosne in 1803 and A. Seguin in 1804, F. W. A. Serturner in Germany isolated morphine in 1806. Pelletier and Caventou in 1818 did the same in France with strychnine, quinine, and other drugs. Pierre Robiquet was another of the many pharmacist-chemists in France and Germany who discovered and isolated the new plant alkaloids so important to medicine—among them atropine, colchicine, and cocaine.
Pharmacology became an independent subject for the first time through the efforts of Rudolph Buchheim (1820-79) in Dorpat and his pupil Oswald Schmiedeberg (1830-1920) in Strassburg. Schmiedeberg’s teachings on experimental pharmacology were brought to America by John J. Abel (1857-1938), who further enlarged upon the earlier activities of H. C. Wood and Silas Weir Mitchell of Philadelphia. Abel became head of materia medica and therapeutics at the University of Michigan in 1891, and later, at Johns Hopkins, he occupied the first chair in the United States in the new discipline of pharmacology. Other schools soon established departments of pharmacology, but the schools of pharmacy were relatively late in introducing this specialized discipline.
In England Alexander Crum Brown (1838-1922) and Thomas Fraser advanced the discipline by correlating the actions of drugs with their chemical composition. As more and more drugs were isolated and their chemical nature understood, it became possible to create therapeutic compounds by building them from basic units. Alkaloids and antipyretics (fever-lowering compounds) were among the first drugs synthesized.
Matthias Schleiden (1804-81) and Theodor Schwann (1810-82), the latter a student of Johannes Muller’s, developed one of the most important conceptions of modern biology. Although it had been previously known that parts of plants were cellular, Schleiden was the first to state explicitly that each plant was a community of cells, with each cell having a separate existence. Schwann generalized Schleiden’s conclusions to all life—animal and plant. Much of the information that led to the understanding and elaboration of the cell theory depended on technical advances in microscopes. It was not until the 1830s, when Amici and Chevalier produced the achromatic lens, that the finer structure of cells could be examined.
Even when the idea was accepted that all living creatures are composed of living cells, the question arose of how the cells originated. Schleiden proposed that the cell and its components formed as a result of a chemical precipitation out of an undifferentiated mass. It was another student of Muller’s, Rudolf Virchow, who overthrew the speculative explanations and firmly established the proposition (in which he was joined by many other investigators) that cells arise only from preexisting cells. At first the view proposed by Schleiden was embraced by many outstanding scientists, including Carl Rokitansky, a great pathologist of the time, but the concept of Virchow finally gained full acceptance.
Microscopic Anatomy and Embryology
Among the pupils of Johannes Muller who contributed to an understanding of the microscopic structure of organs was Jacob Henle (1809-85), who was also responsible for early ideas concerning microorganisms as causes of disease. Albert von Kolliker (1817-1905) wrote what may have been the first organized textbook on histology. He explained the development of the embryo on the basis of the new cell theory. The spermatozoan had been known for centuries, but Karl Ernst von Baer (1792-1876) gave the first description of the ovum (egg) of mammals. Robert Remak (1815-65) classified tissues according to their embryological origin into three primary systems (germ layers): ectoderm, mesoderm, and entoderm. The mechanism of cell division, the means by which the embryo enlarges, organs increase, and tissues regenerate, was reported by Walther Flemming in 1882. Wilhelm Waldeyer (1836-1921) named the chromosome in the nucleus of the cell and put forth in 1891 the theory that the basic unit of the nervous system is the nerve cell, the neurone. He also pointed out that cancerous growths arise from epithelial cells in the ectodermal tissue layer.
In keeping with the spirit of correlating the clinical manifestations of illness with the pathological findings in organs, autopsies were the major focus in medicine. In the French and British schools the availability of corpses for pathological examination was quite limited, whereas in Austria and Germany medical institutions were often the centers in which autopsies were performed, usually by one prosector and his pupils. Moreover, in France and Britain the principal pathological studies were made by the clinicians.
Carl Rokitansky (1804-78), a Czech who worked in Vienna at the Institute of Pathology, was an example of the nonpracticing physician, a type that was to become common in the nineteenth century. In his day he was the most outstanding morphological pathologist in the world, performing together with his assistants almost 60,000 autopsies in less than fifty years. His classifications of the changes in organs produced by disease set standards acclaimed by all. However, Rokitansky’s reliance on humoral theories (he tried to reconcile the ancient concepts with modern anatomical knowledge) led to devastating criticisms by the young Virchow which shook his standing, but he remained an honored pathologic anatomist throughout his life.
Rudolf Virchow (1821-1902), one of Muller’s students in Germany, was called the “Pope” of medicine in Europe because of the eminence of his scientific influence. He strove to integrate clinical medicine, morbid anatomy, and physiology. His dictum, “all cells come from other cells,” radically altered the direction of medical thinking toward the concept that disease was produced by disturbances in the structure and function of the body cells. Such ideas had occurred before, but Virchow was so thorough in proofs and so convincing in argument that the medical world readily accepted his pronouncements. Henceforth the target of treatment would be the cell. Among his other major contributions were the discovery of the disease entity leukemia and his studies on the nature of thrombosis, embolism, and phlebitis, which promulgated principles still valid today.
Virchow’s indefatigable energy and voracious mind took him into a variety of fields besides pathology: anthropology, archaeology, history, politics, public health, and sociology. His zeal for reform encompassed proposing that social conditions were a primary culprit in epidemics and advocating a reorganization of medical education and licensure.
However, with all his extraordinary gifts, Virchow was not above human frailties. His attacks on Rokitansky and on Karl August Wunderlich, the influential German clinician, while soundly based, contained elements of personal acrimony. Nor did he readily accept the bacterial theory of disease, pointing out that the presence of a bacterium in a diseased area did not of itself signify that it was the cause of the disease. His emphasis on the reaction of the body’s cells to an invading organism rather than the organism itself is consonant with present views that the host’s response to a noxious agent—bacterial, viral, or chemical—is as significant as the invader. Yet, Virchow did make too little of the role of microorganisms, and in the frustrating battle for asepsis by Semmelweis (described later) Virchow did not give his support.
But throughout his long life Virchow’s scientific interests and influence never waned. By the time of his death, he had been acclaimed throughout the world. If medicine consists essentially of endeavors related to concepts of illness, methods of treatment, education and organization of the healing professions, and measures directed toward preventing disease and maintaining health, then Virchow was indeed the complete man of medicine.
CLINICAL SCHOOLS AND THE CLINICIANS
The outstanding characteristic of nineteenth-century medicine was the correlation of discoveries in the laboratory and autopsy room with observations at the bedside, and it was principally the hospital where such investigations and interconnections were pursued. In the first half of the century, leadership in clinical science resided in France, but it later passed to the British Isles, and then to the German-speaking countries.
An important factor in the emergence of Paris as the leading clinical school was the French Revolution. As the old regime was swept away, so were ancient ideas and inhibitions, opening the way to new approaches by experiment, an emphasis on pragmatism rather than theory, and bedside observation instead of reasoning by concept. The hospital became more important as the focus of medical activity, public health measures were seen as a duty of government, and medical practice was open to all classes. The wounds sustained in the savage turmoil of the Revolution and after had increased the need for surgery, and since physicians appeared to have little effect on illnesses and epidemics, surgeons gained an equal status. Nevertheless, as surgery and medicine coalesced into one profession, specializations began to occur as the new discoveries accumulated.
Philippe Pinel (1745-1826) was representative of both the eighteenth and nineteenth centuries. His concern for the classification of disease was a holdover from the past, and his concentration on objective clinical study in a special field fitted the trend toward specialization. Pinel’s close observation of people with mental illness and his astute evaluation of the results of treatment led him to advocate a change in insane asylums from forcible restraint to gentleness, persuasion, and a cheerful environment which benefited from the influences of family and friends.
Rene-Theophile-Hyacinthe Laennec (1781-1826) was one of the greatest clinicians of all time. He made outstanding contributions to the pathological and clinical understanding of diseases of the chest—notably emphysema, bronchiectasis, and tuberculosis—but he is best remembered for his invention and use of the stethoscope.
Before Laennec, the sounds of the lungs and heart were studied by holding one’s ear against the chest, a technique with numerous disadvantages. After observing two children transmit scratching noises to each other’s ear via a wooden board, Laennec got the idea of rolling up a sheaf of papers to aid him in listening to a patient’s chest. His next step was to construct a wooden cylinder, and he was amazed to discern sounds he had never heard or hardly appreciated before. Using the new information, Laennec was able to illuminate the clinical picture of many diseases. His monaural stethoscope was further improved and eventually became the binaural device used today as a regular tool of every clinician.
As an adherent of the hated royalist cause, Laennec did not reach the popularity or influence that was achieved by his contemporary Francois-Joseph-Victor Broussais (1772-1838), whose physical vigor, brilliant personality, and devoted espousal of progressive social attitudes made him perhaps the most influential physician in France. In agreeing with the Parisian school on the importance of matching the clinical picture with abnormalities (lesions) in the organs of the body, Broussais vigorously ridiculed attempts to classify diseases according to symptoms. In his view, proper treatment must focus on the pathologic changes in tissue, not on the outmoded doctrines of the humors. The remarkable numbers of intestinal lesions which he saw, especially of typhoid fever, led him to conclude that the gastrointestinal tract was the primary source of most illnesses, especially those with fever. Believing Brown’s doctrine that irritation was the basic property of living tissues, he considered changes in tissue heat the fundamental determinant of health and illness. From the congestion he saw in the intestines, he reasoned that disengorgement to reduce heat was necessary. He therefore used an accepted method of removing the plethora of blood: bloodletting. Instead of the cumbersome maneuver of venesection, he employed an easier means—applying leeches. So convincing were his teachings that the physicians of France imported in a single year over forty million leeches.
One of the most effective techniques for evaluating the efficacy of treatment is statistics, although circumspection and objectivity in collection and analysis are required. The introduction of numerical analysis of bloodletting results by Pierre-Charles-Alexandre Louis (1787-1872) was not only a fatal blow to venesection but also a major force leading to the scientific evaluation of all therapies.
Although Pierre Bretonneau (1778-1862) preceded Louis in establishing typhoid fever as a specific entity, Bretonneau’s complex term “dothienenteritis” was replaced by Louis’s simpler “typhoid.” On the other hand, Bretonneau’s “diphtheria” (diphtherie) had a more permanent place in history.
Among other outstanding names of the French school were Armand Trousseau (1801-67), who contributed masterful and perceptive treatises on illness; Jean-Baptiste Bouillaud (1796-1881) (evidently the model for Balzac’s Dr. Bianchon); Pierre-Adolphe Piorry, a pioneer in the use of an instrument for percussion of the chest (the pleximeter); Francois-Olive Rayer, a contributor to clinical and laboratory knowledge who inspired Bernard in physiology, and Davaine and Villemin in their work on infection. Guillaume B. A. Duchenne (1806-75) and Jean-Martin Charcot (1825-93) were the virtual founders of neurology in France. Duchenne, who started as a country practitioner, made use of the new electrical current reported by Michael Faraday (1791-1867) to treat patients with rheumatism and to study the actions of muscles. Charcot became world-famous through his clinical teachings at the Salpetriere Hospital of Paris where he made an extraordinary number of original contributions to different fields of medicine, most of them related to the nervous system. Several syndromes bear his name, including “Charcot’s joint” (the joint derangement of locomotor ataxia caused by syphilis). His reputation and his writings on hysteria and hypnotism attracted the young Freud, who went from Vienna to Paris specifically to observe Charcot’s work.
Surgery in Paris was comparatively well-developed, especially in the first half of the century, with much of the advance due to the bloody events of the Revolution and Napoleonic wars. Perhaps the most personally popular of all surgeons was Dominique-Jean Larrey (1766-1842), so highly respected and admired by friend and foe alike that Napoleon called him the most virtuous man he had ever known. Although Larrey was famous as a surgeon (he is said to have performed over two hundred amputations during a twenty-four-hour period in the Russian campaign) and also as a clinician (he wrote vivid descriptions of “trench foot,” scurvy, contagious eye infections, and a method of feeding through a stomach tube), perhaps his most influential contribution was the creation of “flying ambulances,” wagons for stretcher use during battle. Unheard of until then, Larrey’s vehicles and transport system went into operation as the action started, affording a tremendous boost to morale and a much greater opportunity for effective treatment. Moreover, his attention to the wounded of both sides in battle was, in a sense, the harbinger of the principles of the Red Cross, which was formed later in the century.
While Larrey was adored, his colleague Guillaume Dupuytren (1777-1835) was actively disliked, though still admired. A spellbinding lecturer, incisive bedside teacher, indefatigable worker, enormously successful practitioner, and brilliant contributor to surgical knowledge, he nevertheless alienated colleagues and acquaintances by his cold, abrasive manner and devious machinations against others. He was innovative in devising instruments and bold in performing daring operative feats with successful outcome. Pierre-Francois Percy (1754-1825), a surgical colleague, called him “the first of surgeons and the least of men.”
There were many other surgeons in Paris who were pioneer contributors: Recamier, who may have been the first to remove the uterus; Roux, the thyroid; Lisfranc, the rectum. Pravaz introduced the syringe. Lembert contributed to intestinal surgery, Meniere to the medical and surgical aspects of ear diseases. Pierre-Paul Broca (1824-80) firmly established, on the basis of clinical and pathologic evidence, that the speech function is located in a distinct area of the brain, now called “Broca’s area.” He was also a pioneer in anthropology, a discipline opposed by the government and by churchmen who felt that the concept of the anatomical localization of the mind in the brain was too materialistic. Paris continued to be a great school for surgical teaching and learning throughout the century, although the principal centers of influence shifted in the later decades to Britain and the German-speaking countries.
Simultaneous with the great tradition that developed in Paris, a spirit of clinical investigation also arose in Dublin at Meath Hospital, both self-generated and influenced by Parisian principles. As in London, many of the clinicians in Dublin were Scottish in either origin or training. One may say that a strong Scottish thread ran through the fabric of medicine in England, Ireland, and the United States.
John Cheyne (1777-1836), although born and trained in Scotland where he published a book on diseases of children, made his greatest contributions in Dublin as an influential member of a group often referred to as the “Irish School.” His detailed accounts of a variety of diseases and his writings on education gained him a worldwide reputation as a great teacher and practitioner. The term “Cheyne-Stokes respiration,” a type of irregular breathing, has remained in medical parlance.
William Stokes (1804-78) was born in Dublin, but he too studied and wrote in Scotland (The Use of the Stethoscope) before returning to Dublin in his early twenties to become a highly popular lecturer and bedside teacher. Two of his books, Diseases of the Chest and Diseases of the Heart, were to become important standard texts for generations. In addition to the breathing abnormality which linked his name with Cheyne’s, his name is also attached, together with that of Robert Adams, to a dysfunction of the heartbeat first described by Adams in the same century: “Stokes-Adams” heart block. Although he mistakenly believed that typhoid and typhus fevers were the same disease, he understood and emphasized the importance of public health and preventive measures.
The most famous teacher of the Dublin group was Robert James Graves (1796-1853), whose bedside rounds were widely known as superb instructional exercises. He is the eponym (“Graves’ disease”) for that combination of thyroid enlargement, nervousness, sweating, and pronounced stare referred to as “toxic exophthalmic goiter.” It was Graves who overturned the past dietary restrictions for patients with fever by urging a full, nutritious diet for all ill patients. He suggested that his own epitaph could well read, “He fed fevers.”
Another influential clinician of the Dublin school was Dominic John Corrigan (1802-80). He is best remembered for his description of the pathologic cause and characteristic pulse, called “Corrigan’s pulse,” of a disease of the aortic valves of the heart. Abraham Colles (1773-1843) described so thoroughly the principles and methods of treating a fracture of the wrist that it has continued to be known as the “Colles fracture.”
London and Edinburgh
One of the most influential teaching institutions of the century, Guy’s Hospital and Medical School, founded in 1721 through the sponsorship and financial support of Thomas Guy, a publisher and investor, became world-famous as a center for practice and study. The “great men of Guy’s”—Bright, Addison, Hodgkin, all physicians, and Cooper, a surgeon—were the leading lights of London and all were products of the Edinburgh Medical School.
One of the many innovations of Richard Bright (1789-1858) was the assignment of a special clinical ward with all the supporting facilities for the express purpose of studying one particular group of diseases, a forerunner of the subspecialty organization of the twentieth century. His masterful reports on the clinical and pathological nature of diseases of the kidneys led to the name “Bright’s disease.”
Perhaps the most imposing of the leaders at Guy’s was Thomas Addison (1793-1860), whose severe, pompous manner, precisely chosen words, and physically impressive appearance struck fear into students. His thorough examinations and perceptive analyses earned him the awestruck respect of his colleagues. Pernicious anemia and adrenal insufficiency are both still referred to as “Addison’s anemia” and “Addison’s disease of the adrenals.”
The third member of the medical triumvirate, Thomas Hodgkin (1798-1866), was a man of exceptional generosity and modesty. A practicing Quaker, he dressed in the characteristic clothing of the sect and devoted much of his time to charity, eventually leaving clinical activities to engage in philanthropic work, travel, and study. “Hodgkin’s disease,” a term introduced by Samuel Wilks in 1865, is a clinicopathologic syndrome described by Hodgkin in 1832 which is characterized by enlargements of the spleen and lymphatic system.
Another famous member of the staff at Guy’s Hospital was William Withey Gull (1816-90). He especially condemned prescriptions containing multiple drugs and composed many widely quoted epigrams: “Savages explain; science investigates.” “Nursing, sometimes a trade, sometimes a profession, ought to be a religion.”
Many other physicians made contributions in the early half of the century. James Parkinson (1755-1824), for instance, gained recognition for his description of a neurological disorder now known as “Parkinson’s disease,” but he was also an important writer on paleontology. John Hughlings Jackson (1835-1911) and William Richard Gowers (1845-1915) advanced further the idea that functions of the brain and spinal cord are localized in specific areas, as had been established in the Paris school.
The fourth of the “great men of Guy’s” was Astley Cooper (1768-1841). Bettany wrote, “No surgeon, before or since, has filled so large a space in the public eye.” Although his financial position was secure because of his wife’s fortune, he slaved day and night—examining, operating, studying, demonstrating, lecturing, dissecting, and writing. In each of his many endeavors he was brilliant, thorough, and communicative. An elegant, careful operator (before the days of ether anesthesia!), he took pains to enable his students to witness clearly every step of the procedure. “Cooper’s fascia” and “Cooper’s hernia” are only two of the conditions which are named for his work.
Since he had a passion for dissection and lost no opportunity to pursue his anatomical studies, he had dealings, out of necessity, with people who obtained corpses surreptitiously—the so-called “resurrectionists.” He helped to defend them with argument and money. Those who were imprisoned he rewarded by supporting their families. Despite the clear antagonism of the populace to “body-snatching,” Cooper remained unassailable in position and esteem. His life from beginning to end seemed to be an unbroken upward path of success.
Mention may be made of an entirely different fate that befell Robert Knox (1791-1862) in Edinburgh, a contemporary of Cooper’s and one of the most famous teachers of anatomy at the time. When the nefarious activities of Burke and Hare, who murdered to obtain the bodies sold for dissection, were brought to light, Knox was falsely accused of complicity. Although later completely exonerated, he was vilified by the people, and his prestige was so irretrievably lowered that he never again achieved a position of influence.
The surname Bell, important in nineteenth-century surgery, was held by two unrelated families of Scotland. Benjamin Bell (1749-1806), a prominent, popular, practicing surgeon in Edinburgh trained in Paris and also in London, wrote a multivolume system of surgery that rivaled the influential text by Heister. His sons George and Joseph, grandson Benjamin, and great-grandson Joseph (1837-1911), often identified as the model for Conan Doyle’s Sherlock Holmes, continued the highly respected surgical tradition into the twentieth century.
The second separate family of Bells contained even more famous members in the persons of the brothers John (1763-1820) and Charles (1774-1842), who were among the leading surgeons of Britain. Although Charles Bell was a highly competent surgeon, he became better known as an anatomist—especially of the nervous system. The priority of his experiments proving the motor function of the anterior nerve roots emerging from the spinal canal and the sensory characteristics of the posterior roots has been challenged by scholars who suggest that Magendie in Paris gave the first definitive proofs. Bell did discover that the fifth cranial nerve has both sensory and motor bundles and that injury to the seventh cranial nerve produces the facial paralysis now called “Bell’s palsy.”
John Bell, who also wrote and illustrated books on anatomy, is best remembered for his writings on history, vascular surgery, and wounds. The bitter, personal controversies which surrounded John Bell, at one time forcing him to leave Edinburgh for London, were typical of the acrimonious feuds which involved other surgeons of the English-Scottish schools.
Robert Liston (1794-1847) was probably the most dexterous operator in England, introducing innovative techniques and successfully removing tumors deemed by others to be unresectable. He was also the first in Britain to employ ether anesthesia. His colleague and antagonist James Syme (1799-1870) developed procedures which permitted the excision of joints, thus preserving the limb from amputation. A contemporary later wrote of Syme, “He never wasted a word, or a drop of ink, or a drop of blood.”
Benjamin Brodie (1783-1862), William Fergusson (1808-77), and James Paget (1814-99) were also outstanding surgeons of London. Brodie, a versatile and generous man, was a busy practitioner, physiologist, philosophic writer, and medical statesman, but the immensely popular Fergusson may have exceeded him in variety of interests. Not only did he write a highly reputed system of surgery and a book on the history of anatomy and surgery but he also invented instruments, did expert carpentry and metalwork, played the violin well, fished, danced, and enthusiastically supported budding writers and students.
Of the three, however, Paget’s name is best known to succeeding generations, partly because of the eponymic “Paget’s disease” of the nipple (an eczema which heralds the presence of carcinoma) and “Paget’s disease” of the bones (a deformity produced by an error in calcium metabolism). He discovered the trichina infestation in human muscles, wrote brilliant essays, gave painstakingly prepared, eloquent lectures, and at St. Bartholomew’s Hospital gained the reputation of being the best surgical diagnostician in Britain, with fabulous prescience in deciding what operation to do. “Go to Paget to find out what is the matter, and then to Fergusson to have it cut out” was a popular saying.
James Young Simpson (1811-70), perhaps the most famous obstetrician and gynecologist in Britain, born, trained, and nurtured in Scotland, introduced chloroform as an anesthetic. In his quest for a more pleasant and more controllable agent than ether, he tried a drug suggested to him by a Liverpool chemist, which had recently been named “chloroform” by J. B. Dumas in Paris but had been discovered earlier, in the 1830s, by three independent investigators: S. Guthrie in the United States, Soubeiran in France, and Liebig in Germany. One evening Simpson and his friends inhaled the substance at home and found that they had all been rendered unconscious. Impressed by its effectiveness and pleasant smell, he tried it for operations and deliveries. For the next half-century, chloroform was the most frequently used anesthetic in Great Britain. Simpson also made many other contributions to obstetrics and gynecology. Although he was a highly principled man, supporting women’s entrance into medicine, his personal animosity toward James Syme spilled over to include the teachings of Joseph Lister, Syme’s son-in-law.
In the German-speaking countries Naturphilosophie was still prominent at the same time as the scientific-minded were advancing the cause of observational medicine. If the British and French schools were skeptics in therapy, the Vienna school was virtually nihilistic, placing little if any reliance on drugs. The leading man in medicine in Vienna was Karl Rokitansky, but he was entirely a pathologist. The outstanding clinician and perhaps the most nihilistic of all was Joseph Skoda (1805-81), a pupil of Rokitansky’s. Skoda refined Laennec’s auscultatory and percussive techniques to explain the physical bases for the various sounds produced by pathological lesions in the chest, showing that it was not the disease itself but the alterations in the physical conditions in the organ which produced the properties of pitch and timbre. An objective evaluator of treatment, he saw little use for medications or active interference in sickness, even giving placebos to patients with pneumonia to demonstrate that the illness ran its course unaffected by any therapy. At the time, this nihilism was probably more beneficial than the bleedings, emetics, and purgings that were still part of medical treatment.
Ferdinand Hebra (1816-80), one of the first to specialize entirely in skin diseases, began his career on Skoda’s service for chest diseases but, with his teacher’s support, founded a division of dermatology. He based his classification on the gross and microscopic changes in the tissues instead of on symptomatology or on general disease categories. His treatment was therefore directed toward the local problem rather than abnormalities in the humors which were still considered the primary causes. He discovered that the destruction of the itch-mite parasite cured scabies, a condition he recognized as transmissible from person to person.
There were other exceptional leaders in Vienna, one of whom, Joseph Hyrtl (1810-94), originally from Hungary, became a famous teacher and historian of anatomy. Perhaps the most noteworthy historical figure in Vienna was Ignaz Semmelweis, whose important contributions and tragic career are described in the chapter on infection.
The theorizing, mystical Naturphilosophie which enveloped scientific and medical thinking in Germany in the early part of the century gradually gave way to direct observation and experiment, with the establishment of laboratory studies on body functions led by Johannes Muller and his followers.
Meanwhile a clinical school was emerging all over Germany, not in just one or two cities; possibly this was because Germany at the beginning of the century was a conglomeration of divided, independent political units, with no one central city representing the focus of feeling or governmental organization. Johann Lukas Schonlein (1793-1864) erected a classification of diseases, in the manner of a “Natural History” school, which proved so arbitrary and artificial that it did not outlast him. But his intensive use of percussion and auscultation, incisive lectures and clinical demonstrations, and emphasis on the newest methods, including examination of the blood and urine, made him a leader in clinical medicine. His name was given to a bleeding disease, “Schonlein’s purpura,” and to a parasitic fungus, Achorion (or Trichophyton) schonleini.
Hermann von Helmholtz (1821-94) was one of the great geniuses of medicine, who entered the profession only because a career in physics appeared to offer little chance of a livelihood. Eventually he did become a physicist and professor of physics in Berlin in 1871, but the thirty years spent in medical practice and investigation were never forgotten. “Medicine was once the intellectual home in which I grew up; and even the immigrant best understands and is best understood by his native land.”
Even while a young army surgeon he had maintained his interest in physics and mathematics, and in 1847 he published a treatise of far-reaching importance to physics and physiology, Uber die Erhaltung der Kraft (The Conservation of Energy), which formulated the law (also independently developed by J. R. Mayer in 1842) that although energy could be transformed into different forms its total amount was constant—whether in the universe or in a living organism. In his later years as a physicist Helmholtz also added to the knowledge of electrodynamics, and his assistant Heinrich Hertz (1857-94) discovered the waves on which the electromagnetic transmission of the twentieth century is based.
Helmholtz made his greatest impact on medicine through quantitative determinations in the physiology of sight, sound, and nerve impulses. Taking the original work of 1801 by Thomas Young, the English physician and physicist, he confirmed and broadened the studies to develop an explanation of color vision, the Young-Helmholtz theory. Intent on trying to look inside the eye of a living person, he devised an instrument consisting essentially of a concave mirror with a hole in the center which shone light into the pupil and enabled the viewer to see the reflected image of the retina. Helmholtz reported “the great joy of being the first to see a living human retina.” From then on, abnormalities in the eye were open to the diagnostic gaze of the physician.
Another influential member of the German clinical school, Karl August Wunderlich (1815-77), was in large measure responsible for popularizing the thermometer in clinical practice. Although Wunderlich went too far in believing that each disease entity had its own characteristic fever graph, his studies on fever made practitioners realize how important the temperature curve was. Wunderlich also wrote extensively on his reactions to what he saw at various clinics abroad. It was he possibly more than anyone else who brought back to Germany many of the principles and methods of the Paris school. He emphasized the need for more intensive study of therapy, which had been virtually left out in the French and Austrian institutions.
The United States
Although American medicine throughout the nineteenth century continued to depend upon Western Europe for innovations, individual American physicians demonstrated both intelligence and initiative. Ephraim McDowell (1771-1830), a physician practicing on the Kentucky frontier, was confronted in 1809 by a patient suffering from a large ovarian cyst. Fortunately for the patient, McDowell was a well-trained practitioner who had studied at the University of Edinburgh. He informed his patient that an operation to remove the cyst was considered almost certain death, but if she were willing to travel the sixty miles to his office in Danville he would attempt the operation. The patient, in the midst of winter, made the long trek on horseback and placed herself in his hands. McDowell recorded that while she recited psalms, he opened her abdomen and removed a diseased ovary weighing almost twenty pounds. Twenty-five days after the operation the patient returned home to live for another thirty-one years. McDowell successfully removed several more ovarian cysts before publicizing his work and gaining international recognition.
Obstetrical and gynecological problems are common to all peoples, and this may explain why still another American physician, equally far removed from the centers of medical learning, also pioneered in this area. J. Marion Sims (1813-83), a Southerner, acquired his medical education at the Charleston Medical School and Jefferson Medical College in Philadelphia—an education, he later wrote, which taught him nothing about the practice of medicine. He subsequently began practicing in Alabama, where he discovered a penchant for surgery. In this capacity he was called in to help with a young slave girl who had been in labor for seventy-two hours. Using forceps he delivered the child, but the mother had suffered so much injury that she was left with an opening between the vagina and urinary bladder—a condition considered hopeless. Encountering other cases of this type, he determined to help them. He assembled several slave women suffering from this condition, vesicovaginal fistulas, and at his own expense began four years of experimentation.
He despaired of success until one day he made a startling discovery. While treating a middle-aged woman for a retroversion of the uterus, he remembered the advice of a professor to place the patient in a knee-elbow position and to push the uterus back into position by using one finger in the rectum and another in the vagina. Reluctant to add to his patient’s discomfort by introducing a finger into the rectum, he sought to correct the situation by means of two fingers inserted in the vagina. In the process of turning his hand, the womb seemed to disappear and the patient was suddenly relieved. As she rolled over on her side, she was embarrassed by an explosive sound of air. Sims realized that in turning his hand he had permitted the external air pressure to push the vagina back into normal position, and he could scarcely wait to get back to the hospital to apply his findings. He placed one of his fistula cases in the same position, opened the vagina, and heard the air rush in. He wrote later: “Introducing the bent handle of a spoon I saw everything, as no man had ever seen before. The fistula was as plain as the nose on a man’s face.” After the discovery of the knee-elbow (Sims’) position (later modified to a side position), he devised a special (Sims’) speculum and catheter, learned the value of silver sutures, and developed new surgical techniques which finally enabled him to restore his patients to health. In the process he laid the basis for the specialty of gynecology.
Many other American physicians and surgeons deserve mention: Dr. Philip Syng Physick (1768-1837), generally credited with establishing surgery as a specialty in America; Drs. Joseph and John Warren of Revolutionary War fame and their descendants who provided leadership in New England medicine for several generations; Daniel Drake (1785-1852), stormy petrel of American medical education; and Oliver Wendell Holmes (1809-94), poet, essayist, teacher, and medical practitioner. Holmes, best known as a literary figure, was the first to recognize the contagious nature of childbed or puerperal fever. Holmes’s observations antedate by four years those of Ignaz Semmelweis, the man generally credited with this discovery.
In the second half of the century, one of the most famous physicians was William Osler (1849-1920), who was born and educated in Canada and held professorships not only there but in England and the United States, notably at the Johns Hopkins Hospital and Medical School in Baltimore. Although he was a pragmatic practicing physician who made outstanding contributions to clinical medicine, his main influences were as a beloved teacher of a long line of pupils destined to make lasting contributions to medicine; as a writer of an encyclopedic medical text which was a standard for generations; and as the model of a cultured, articulate, insatiably curious, highly principled physician. He also gave a significant impetus to the study of the history of medicine in the United States.
METHODS OF TREATMENT
In the early years of the nineteenth century, the principal therapies open to European and American physicians were general regimens of diet, exercise, rest, baths and massage, bloodletting, scarification, cupping, blistering, sweating, emetics, purges, enemas, and fumigations. There were multitudes of plant and mineral drugs available, but only a few rested on sound physiological or even empiric foundations: quinine for malaria, digitalis for heart failure, colchicine for gout, and opiates for pain. Many physicians continued to use compounds of arsenic for such diverse complaints as intermittent fever, paralysis, epilepsy, edema, rickets, heart disease, cancer, skin ulcerations, parasites, indigestion, and general debility. Antimony, which had its heyday in a previous century, was also still much in use, possibly sometimes aiding patients with parasitic infestations. For the most part, leading European practitioners, as well as some in America, permitted illnesses to run their course without interference, for careful observers noted little benefit from the therapies available. On the other hand, others believed that “desperate diseases require desperate measures” and favored the use of drastic drugs and procedures. In the United States, by the 1830s and 1840s, the influence of “heroic” medicine (such as bloodletting and strong drugs) was mitigated somewhat by the Louisiana Purchase in 1803, which introduced a large French-speaking population into the U.S. including French physicians who generally preferred assisting nature to battling the disease. Their close contacts with the Paris clinical school also taught them (and educated physicians in the northeast) the advantages of correlating clinical diagnoses with pathological changes in the organs.
Claude Bernard wrote, “Systems do not exist in Nature but only in men’s minds.” Nevertheless, numerous systems of therapy and explanations for illness flourished in the nineteenth century, a few of which may be mentioned. Some now seem close to quackery, but generally these theories of disease and treatment were sincere attempts to reconcile the symptoms of an illness with current knowledge.
Perhaps the most influential system was homeopathy, a creation of Samuel Hahnemann (1755-1843) in Germany, which taught that drugs which produced symptoms in a person resembling those of a specific illness would cure the patient if used in smaller amounts. However, the homeopathic system used such infinitesimal doses that they could hardly have had any effect, and, furthermore, the homeopaths were uncritical in their evaluation of results. But while their methods may have denied patients the therapeutic benefits of the few available specifics like quinine and digitalis, the homeopaths did spare their patients the harm of bleeding and purging. The doctrine spread throughout the world and was especially popular in the United States, where schools of homeopathy were founded, notably in Philadelphia and New York. As newer knowledge in physiology, pharmacology, bacteriology, and pathology developed and as more useful therapeutic agents appeared, homeopathy lost much of its appeal.
Hydrotherapy, an all-purpose therapy, was based on the ancient concepts of the humors—the necessity for expelling excesses. Vincenz Priessnitz (1799-1851), the principal proponent, administered water in every conceivable way, but his regimen also included simple, nourishing food and exercise. This system, which achieved great popularity, led to the founding of hydropathic institutions in Europe and the United States. The opposite view—using only dry foods and substances—also had advocates, but they were few. The Thomsonians, who emphasized herbal medicines and steam baths, were one of a diverse group of practitioners—prominent especially in the U.S.—who stressed “Nature’s remedies and folk medicine.”
Another medical therapy, which arose in the eighteenth century but had a strong impact throughout the world in the following century, was cranioscopy. Also called phrenology, the doctrine was promulgated by Franz Joseph Gall (1758-1828), a clinician born in Germany and educated in France and Austria, who practiced and lectured in Paris for over twenty years. He taught that the shape and irregularities of the skull were projections of the underlying brain and consequently indications of a person’s mental characteristics—a conclusion with no basis in fact. Gall’s concept of localizing mental processes was a good idea, but his uncritical exaggerations carried it too far. However, the very notion that the brain is a composite of discrete but interrelated functions anatomically confined to specific areas, an old but incompletely realized concept, was a principle that was to become the basic tenet of brain physiology.
In the United States, Andrew Taylor Still (1828-1917), who had attended medical lectures in Kansas City, organized a doctrine of medicine in 1892 which he called osteopathy. Concluding that drugs were ineffective in producing cures, he set up a system with two basic tenets: the living human body contains within itself all the remedies necessary to protect against disease; the correct functioning of the body requires a proper alignment of the bones, muscles, and nerves. Considerable dispute developed between osteopaths and regular practitioners, but over the decades osteopathic physicians so modified the original principles that they became almost indistinguishable in their methods from traditional physicians. They took up drugs, accepted vaccines, and utilized surgery. Many schools of osteopathy in the United States now have virtually the same curricula, educational standards, and practices as the regular schools.
Another healing system which ascribes disease to derangements in structure and function of the vertebrae is chiropractic, founded in 1895 by Daniel D. Palmer (1845-1913), who had earlier practiced magnetic healing. Proper adjustments of the spinal column are supposed to cure the ailments of the internal organs—a doctrine physicians generally regard as without foundation. In 1968 the U.S. Secretary of Health, Education, and Welfare reported to Congress that the claims of chiropractic were invalid, were not subjected to research evaluation, and should not entitle its practitioners to be reimbursed under the Medicare law. Nevertheless, the U.S. National Center for Health Statistics estimated that in 1965-66 approximately two percent of the population consulted chiropractors for treatment of back problems and other ailments.
Another healing cult which is more religious than medical is Christian Science. In the nineteenth century, Phineas P. Quimby (1802-66), a mesmerist, attributed his cures to the faith of the patient. Mary Baker Eddy (1821-1910), one of his patients though not his direct disciple, later founded the Christian Science church, which views health and recovery from disease as dependent entirely on following God’s divine laws. Confrontations between Christian Science and physicians have occurred when an operation or other treatment deemed necessary has been refused by a church adherent.
There were also numerous quack cults whose objective was to amass money by hoodwinking the public, eager as it was to find more convincing cures than were offered by orthodox physicians. James Morison’s “Hygeian” system in England held out the glowing prospect of a medical doctrine applicable to all types of illnesses—the cure of disease and the maintenance of health by freeing the blood of all impurities through the use of secret-formula pills (which later analysis showed to be a combination of strong laxatives). Although many reputable public figures within and outside the medical profession condemned Morison as a charlatan, and although newspapers lampooned the “Universal Pills,” Morison’s business thrived through widespread testimonial advertising and clever salesmanship in which the medical profession was castigated. Sales, which spread into France, the United States, Germany, and other countries, continued through the nineteenth century even after Morison’s death in 1840. Even exposure of the fraud in notorious court cases failed to dampen the enthusiastic embrace of the public, which sent several petitions to Parliament containing ten to twenty thousand signatures condemning orthodox medicines and extolling the virtues of Morison.
Another cure-all of great popularity was “Dr. James’s Fever Powder,” which was developed in the eighteenth century and still used into the twentieth. Its principal ingredient was antimony. The good reputation of Dr. James and his apparently sincere belief in the efficacy of his nostrum, together with the extravagant promotional activities by James and the bookseller John Newbery, succeeded in spreading the powder’s fame.
Since there was virtually no regulation of secret nostrums in most countries, the popularity of patent medicines depended entirely on the effectiveness of their advertising. Strong opposition to self-medication and proprietary drugs did not develop until the twentieth century. Indeed some nineteenth-century preparations were introduced by physicians themselves, and the government even allowed advertising on the very tax stamps levied on these products.
Surgery advanced only slowly, limited as it was by the lack of effective pain control during operations and by devastating postoperative infections. Both of these obstacles were substantially removed by the discovery of anesthesia and the proof that germs caused infection.
Although effective anesthesia was first discovered and put to surgical use in the United States, soporific, narcotic, and analgesic agents such as opiates and plants containing hyoscyamus and mandragora had been put to such use for thousands of years. Alcohol also had been resorted to for centuries to make a patient oblivious enough to pain to permit surgical procedures on the surface of the body or on the bones. Abdominal operations, including Caesarean section, were indeed performed at various times and places, but the systematic invasion of body cavities and internal systems was not feasible until the patient could be put to sleep deeply and safely enough to permit unhurried operative maneuvers.
In 1772, Joseph Priestley discovered nitrous oxide gas. Later, whiffs of nitrous oxide (soon called “laughing gas”) were indulged in at “revels” for social amusement and the euphoria produced. Noting a reduced sensitivity to pain in these “revelers,” Humphry Davy (1778-1829) suggested that “laughing gas” might be useful to surgery, but no one followed up his suggestion.
Other means of preventing pain through the loss of consciousness were also put forth from time to time. Henry Hill Hickman in 1824 produced a state of “suspended animation” in animals through asphyxia achieved by inhalation of carbon dioxide, which permitted him to perform operations without causing pain. He recommended this technique for use on humans but could not convince scientists.
Mesmerism, or “animal magnetism” (although branded quackery it was an early form of hypnotism), also played a part in opening minds to the possibilities of making people insensitive to pain. Although James Esdaile in India, stimulated by the publications of John Elliotson, performed seventy-three painless operations of different types using mesmerism, the medical profession worldwide remained unconvinced. Indeed, upon John Elliotson (1791-1868), the principal advocate of mesmerism, the brunt of denunciation fell. The hostile reception that his demonstrations and writings received led to his virtual ostracism. A well-trained, energetic investigator and practitioner, he seems always to have been eager to embrace new ideas, though sometimes with insufficient critical evaluation. For instance, his vigorous espousal of phrenology was one of the reasons for opposition to his reports. On the other hand he had been among the first to take up Laennec’s stethoscope, a step so unusual at the time that it also counted against him among his colleagues.
Unrecognized as a psychophysiological phenomenon (James Braid introduced the term “hypnotism” in 1843) and therefore misinterpreted by both proponents and opponents alike, mesmerism occupied the attention of doctors and the public for years. When the mesmerists learned of ether anesthesia they applauded its discovery, claiming that their own contributions had prepared the minds of the time to accept a sleep-induced state for operation. In England, Liston’s remark on using ether for the first time, “This Yankee dodge beats mesmerism hollow,” indicates that mesmerism’s analgesic effects had been implicitly realized even by the antimesmerists.
As anatomical knowledge and surgical techniques improved, the search for safe methods to prevent pain became even more pressing. The advent of professional dentistry added a new urgency to this quest because of the sensitivity of mouth and gums. Although death as an alternative frequently drove patients to the surgeon, few people were known to die from toothache. The urge to see a dentist was easily resisted, so it may be more than coincidence that dentists seized the initiative in the quest for freedom from pain.
By 1831 all three basic anesthetic agents—ether, nitrous oxide gas, and chloroform—had been discovered, but no medical applications of their pain-relieving properties had been made. In all likelihood the first man to apply his social experiences with laughing gas to surgery was Dr. Crawford W. Long (1815-78) of Georgia. In 1842 he performed three minor surgical procedures using sulfuric ether. Apparently not realizing the significance of what he had done, Long made no effort to publicize his discovery until several years later when anesthesia had been hailed as a major breakthrough.
A Connecticut dentist, Dr. Horace Wells (1815-48), on learning of the peculiar properties of nitrous oxide in 1844, tested them by having one of his own teeth removed while under the influence of the gas. Delighted with the results, he administered it to several patients, and then demonstrated his procedure before Dr. John C. Warren’s medical class at Harvard. For some inexplicable reason, the patient cried out, and Wells was booed and hissed. Following Wells’s failure, his friend and fellow dentist William T. G. Morton (1819-68) began experimenting with sulfuric ether. Encouraged by its effectiveness in his dental practice, he, too, contacted Dr. Warren and in 1846 gave the first public demonstration of surgery without pain. News of this momentous event spread rapidly throughout the Western world, and a new era for surgery began. Until Oliver Wendell Holmes supplied the name “anesthesia,” the Boston medical community had been at a loss for a term to describe the condition brought on by this new agent.
After ether was widely accepted, James Simpson in Edinburgh abandoned it for chloroform because of ether’s disagreeable odor, irritating properties, and long induction period. For about a century, chloroform continued to be the agent of choice in Britain, until its unmanageable toxicity and delayed damage to the liver were appreciated. In Germany, even after the superior safety of ether over chloroform had been clearly shown in 1894 (a more than five times higher mortality for chloroform), chloroform remained the favored anesthetic for almost twenty-five years.
In Britain, Simpson’s advocacy of anesthesia in childbirth was vehemently condemned by the Calvinist church fathers as contrary to the Biblical admonition that a woman must bring forth her child in pain. However, the employment of chloroform by John Snow (1813-58) for Queen Victoria during her delivery helped disarm the opponents. The development of anesthesiology as a specialty of medicine owes much to Snow, who devised techniques and analyzed the physiological effects of different agents.
Ether was taken up by many other countries shortly after its introduction: notably France, Sweden, Portugal, Spain, Cuba, and South America. Even in Germany, where chloroform held first position, some preferred ether. Johann Friedrich Dieffenbach (1795-1847), a pioneer in plastic surgery, wrote, “The wonderful dream that pain has been taken away from us has become reality. Pain, the highest consciousness of our earthly existence, the most distinct sensation of the imperfection of our body, must bow before the power of the human mind, before the power of ether vapor.”
Other anesthetic agents were introduced near the end of the century. Ethyl chloride was sprayed locally to induce insensitivity. Cocaine by topical application to the eye was reported by Carl Koller in 1884. Sigmund Freud had earlier studied the anesthetic properties of cocaine but did not pursue the work. The injection of cocaine into nerve trunks to block sensation was investigated by William Halsted in the United States. Cocaine was also the first drug injected into the spinal canal, in 1898, to produce anesthesia, but once its dangers were realized other less toxic and nonhabituating agents were developed. Numerous methods of administering anesthetics were tried: the rectal route was introduced by Pirogov in Russia, and Oré of France originated the intravenous method in 1874. After Fischer in 1902 had synthesized veronal, this barbiturate and other safer and more manageable agents for intravenous use were developed.
The “open” method of dripping the anesthetic on a gauze mask was replaced by “closed” systems in which an airtight mask could deliver a precisely measured amount of vapor and remove the exhaled carbon dioxide through absorption by a calcium compound. Advantages were also perceived in the insertion of tubing through the mouth and voice box into the trachea, thereby preventing the aspiration of secretions and controlling the patient’s respiration. The twentieth century saw refinements in endotracheal anesthesia which permitted an anesthetist to control the flow of air, oxygen, and other gases into the lungs and thus have complete mastery over breathing during an operation. Muscle-relaxing drugs were also put to use in placing the anesthetist in control of respiratory movements and the surgeon in a position to perform manipulations through a totally relaxed abdominal wall.
At first, physicians and surgeons administered anesthesia in addition to practicing their own specialties. As techniques became more complex and knowledge increased, special nurses and technicians were assigned the task. Even well into the 1940s many highly reputable hospitals continued to employ nurse-anesthetists rather than physicians specializing in anesthesia. In 1935 Francis Hoeffer McMechan, supported by his wife Laurette Van Varsevold McMechan, spoke for anesthesiology: “The safety of the patient demands that the anesthetist be able to treat every complication that may arise from the anesthetic itself by the use of methods of treatment that may be indicated. The medical anesthetist can do this, the technician cannot.”
When anesthesia had become commonplace and the limitations of pain had disappeared, surgical procedures multiplied in number and complexity. No longer did the operator have to place the first emphasis on speed and to limit his manipulations mainly to surface areas of the body and the skeletal system. Yet the potential benefits of surgery were overshadowed by the frequent, devastating infections which often resulted in death. Outstanding surgeons everywhere were continually plagued by the dread complications of postoperative purulent infection and gangrene. Only when the bacterial origin of disease had been discovered and the necessity for keeping germs away from the operative field had been proved, notably by Lister, could surgery enter with safety the interior regions of the body. Every country participated in the new age of surgical progress, but the German-speaking countries were early at the forefront.
In the late nineteenth century, perhaps the outstanding surgical innovator in Europe was Albert Christian Theodor Billroth (1829-94). Born a German and educated in Berlin, he made his principal contributions in Zurich and, especially, in Vienna, where he was the first to perform extensive operations on the pharynx, larynx, and stomach successfully. Billroth’s honest, forthright nature was shown by his unprejudiced reports of results, good and bad, a practice he insisted on for all of his staff. His teaching abilities, prominence as a writer on surgery, and personal influence were such that his students filled many of the prestigious chairs of surgery in Europe. His General Surgical Pathology and Therapeutics went through eleven editions, and his History of the German Universities, a book-length treatise on almost all aspects of medical education, set down the ideal tenets toward which schools in Europe and the United States aspired.
Throughout the world, the abdomen, neck, chest, cranial cavity, and spinal cord became common sites for surgical therapy. For instance, operations on the esophagus, stomach, and intestines—heretofore seldom dealt with effectively—were enlarged in scope and refined in technique, especially by the group surrounding Billroth. The nature of appendicitis, one of the most frequent surgical ailments, was elucidated only in 1886, when Reginald Heber Fitz (1843-1913) of Boston described the clinicopathologic entity formerly referred to as “typhlitis.” In 1878 the gallbladder was opened by J. Marion Sims (1813-83), a founder of modern gynecology. The approaches to tumors of the brain and spinal cord by Victor Horsley (1857-1916) in England gave impetus to neurological surgery. Newer instruments and techniques were developed by Koeberlé, Péan, and Lembert. Ruge introduced the frozen-section method of quick pathological examination. The older, standard procedures, such as hernia repair, were modified by Bassini and others to obtain better results. Plastic surgery was improved by Dieffenbach and Thiersch. For every organ and every region, a roster could be assembled of nineteenth-century surgeons who made outstanding contributions.
Especially notable were the advances in operative treatment of the reproductive organs of women. The pioneer work of Ephraim McDowell in 1799 and of J. Marion Sims in 1852 in the United States has already been described. In Europe, Thomas Spencer Wells in 1858, Robert Lawson Tait in 1871, and W. A. Freund in 1878 developed operative procedures on the ovaries, Fallopian tubes, and the uterus. Removal of the baby by Caesarean section became more efficient and safe through the techniques of Porro in 1876 and Saenger in 1882.
So many were the innovations and so far was the domain of surgery extended that by World War I most of the basic operative procedures performed today (with the principal exceptions of thoracic and cardiac surgery) had already been developed. For the most part, the remarkable achievements of surgery in recent decades have been due to increases in physiological understanding, the introduction of safe methods of blood transfusion, the production of antimicrobials, and the improved management of the patient before, during, and after operation.
In the first half of the century, advances in physiology, pathology, and chemistry were not reflected in medical practice, for the physician’s equipment was still limited. Doctors were even considered useless or harmful by large segments of the public, conditioned by the failure of bleedings, purgings, and other manipulations to affect illness or stem epidemics, and by the extravagant but convincing claims and cures promised by quacks. Attacks on nostrums and patent medicines were unpopular and generally ignored.
A dichotomy existed, especially in England, between those who favored mandatory licensing control over all healers, including physicians, and those who strongly advocated allowing anyone to practice medicine, giving patients a choice from among many practitioners and claimants. Political progressives believed that regulation would lead to domination and self-serving restriction of others by the medical profession; conservatives preached that only official bodies could or should determine who was fit to treat people.
Education and Licensure
The nineteenth century saw the establishment of more uniform educational and licensure requirements, but even in ancient times there had been some official supervision and rules for medical practice. The certification ordered by Roger II of Sicily in the twelfth century was expanded by Frederick II in the thirteenth century to comprise a nine-year curriculum, an organized system of state licensing examinations, a mechanism for regulating apothecaries, and a sanctioned schedule of fees. Spain and Germany followed with rules of licensure shortly afterward. In 1511, during the reign of Henry VIII, Parliament created a certifying board which continued to function for about three hundred years.
By the eighteenth century in England, medical education was entirely in the hands of individual doctors, mostly but not exclusively surgeons, who ran their own private schools, which dealt principally with anatomy and surgery until other subjects were added. Although the teachers, such as the Hunter brothers, often imparted a high order of instruction, the students received their clinical education by walking the wards, observing the leaders in the great institutions of London: St. Bartholomew’s, St. Thomas’s, St. George’s, Guy’s, London, and Middlesex hospitals. In contrast, Edinburgh had a regular medical school, operational since 1726, with formal courses of instruction which included regular lectures and bedside teaching.
Attempts to set up adequate certifying bodies met considerable difficulty. At one time there were three separate medical councils (for England, Scotland, and Ireland), and the General Council of Medical Education of 1858 was created to try to bring order to the certifying process. A coordinating body was finally formed by the end of the nineteenth century.
When the nineteenth century dawned, America had only four small medical schools to supply physicians for its burgeoning population, compelling most doctors to acquire their training by apprenticeship. In 1807 the University of Maryland Medical School was organized by a small group of Baltimore physicians as a private venture, and in succeeding years dozens of these proprietary medical schools came into existence. Three or four physicians would apply for a state charter, rent or buy a building, and begin advertising for students. The school year ordinarily lasted from eight to fourteen weeks, and the course work consisted exclusively of listening to lectures. Many proprietary schools granted degrees after one academic year, although they usually required the student to have served a one- or two-year apprenticeship prior to admission. Since these schools were dependent upon student fees for income, few applicants were ever turned down and even fewer failed to graduate. At the initial meeting of the American Medical Association a committee was appointed to examine medical education, and one of its proposals was to lengthen the school year to six months. When the University of Pennsylvania and the College of Physicians and Surgeons in New York followed the recommendation, their enrollments fell drastically, and the lesson was not lost on other schools.
Nearly all efforts to reform medical education foundered on this same rock. Those institutions which raised entrance requirements, lengthened the school year, or increased the amount of course work invariably found themselves losing students to schools with easier requirements. Despite pioneering efforts by Harvard, Michigan, and other schools, it was the end of the nineteenth century before the level of medical education was raised appreciably.
In an effort to bring a measure of unity into the profession, local and state medical societies had gradually come into existence, and these in turn led to the formation of the American Medical Association in 1847. While this organization did not become an effective force until the end of the nineteenth century, it was a strong advocate of improved medical education, fought to establish a code of medical ethics, promoted public health measures, and generally sought to improve the professional status of physicians. While the appearance of the A.M.A. boded well for the future, the public image of the American medical profession as of 1850 was at its nadir.
As the century drew on, a conjunction of circumstances moved American medicine toward professionalization. The most important of these were the fundamental developments in medicine itself. By 1900 the major outlines of human physiology were understood, the role of pathogenic organisms and their vectors was explained, and medicine could operate from a reasonably factual basis. A second factor was the rising American standard of living which brought with it a broadening of education at all levels, and medical schools could scarcely remain untouched.
The first medical school to lead the reform movement was associated with Lind University in Chicago (later Chicago Medical College and presently Northwestern University). In 1859 Lind raised its entrance requirements and lengthened its academic year to five months. The school received no support in its fight to raise educational standards until 1871, when Harvard overhauled its medical school and instituted a three-year graded course, a nine-month academic year, and written and oral examinations. Despite a better than forty percent drop in enrollment, Harvard persisted, and within a few years Pennsylvania, Syracuse, and Michigan swung into line.
The next major step came with the establishment in 1893 of The Johns Hopkins University School of Medicine, which assembled a remarkable faculty headed by William H. Welch and William Osler. Welch, a pathologist, was among the first to introduce microscopy and bacteriology into the United States, and Osler was a firm advocate of more bedside training for medical students. Under the guidance of these two, assisted by William S. Halsted and other outstanding professors, Hopkins drastically reshaped American medical education and set a pattern which persists today. From its inception, Hopkins required a college degree as a prerequisite for admission, provided a four-year graded curriculum, made extensive use of laboratories for teaching purposes, and integrated the hospital and college facilities to provide clinical training to advanced students.
Hopkins flourished, and within a few years its former students and professors were carrying the Hopkins system to all parts of the United States. Two other steps were still needed to place medical education upon a sound basis. In 1904 the A.M.A. created a permanent committee on education, which two years later became the A.M.A. Council on Medical Education. The council immediately began evaluating schools in terms of the ability of their graduates to pass licensing board examinations. However, the council was too closely identified with medicine, and its members recognized the need for a more objective evaluation. This was achieved by persuading the Carnegie Foundation for the Advancement of Teaching to undertake the task. The foundation employed Abraham Flexner, a man who had already studied American higher education, to survey the field, and the report which ensued was a damning indictment of medical education. More important, the Flexner Report (1910) brought foundation money to the better schools, and, by improving them, forced the weaker ones out of business. In the meantime the Council on Medical Education had begun to classify schools on an A, B, C basis, evaluations which played a key role in standardizing medical education.
In France, the decrees of Napoleon in 1803 categorized those who could practice medicine into doctors of medicine, doctors of surgery, and health officer doctors, each division with its own educational prerequisites and licensing examinations. Schools for apothecaries were built and a system ordered for inspecting the shops of apothecaries, druggists, and spicers. Tuition at all of the four state medical schools was kept low to permit students of limited means to enter the medical profession.
In Germany, the regulations varied in the different principalities. In the Duchy of Nassau, for instance, before it was taken over by Prussia, the physicians and surgeons were in one body under the state, and although strict examinations had to be passed to practice medicine a university degree was not essential. In Prussia, in 1825, three classes of licensed doctors were recognized: graduate physicians (who had to spend four years at a university and pass rigorous state examinations—including an additional test for those who entered surgery); wound doctors, first class (with fewer years of schooling and less difficult examinations); and wound doctors, second class (with even less education and less rigorous examinations). Obstetricians, ophthalmologists, and public health doctors also had separate requirements.
State practice of medicine and social insurance were also seen in the German principalities, where physicians were paid by the state but were also permitted some private practice. In Prussia, the proportion of doctors who depended on state stipends grew steadily smaller. Bismarck finally turned to medical and social insurance as a means of winning the support of the general populace for his aim of unifying Germany.
In Russia, after 1864, local governmental organizations, the zemstvos, were responsible for medical service to the poor and mentally ill and acted as public health overseers. The feldsher, a combination of male trained nurse and pharmacist who went out into the countryside, was also a provider of health care. Regular physicians continued to be trained in the large city universities.
Specialization in the nineteenth century was at first vehemently opposed by many in the profession who felt that it would be detrimental to the patient. Examples from the past of itinerant charlatans who specialized in pulling teeth, cutting for the stone, or treating only one kind of illness (for instance, venereal disease) caused ethical practitioners, and many lay people as well, to regard with suspicion any physician who set himself up to treat one group of diseases or one organ system. It smacked too much of the tradesman. Nevertheless, as the pressures of scientific, social, and economic factors became irresistible, specialization became an accepted fact. As medical information grew voluminous and new techniques became more complex, no one practitioner could encompass it all. The patient was urged to seek a physician who devoted his time and skill to one type of illness or manipulation. The opportunities for commanding higher fees, working less onerous hours, and receiving greater respect were also strong incentives for doctors to specialize. Moreover, the increasingly significant industrial principle of the division of labor seemed to encourage the compartmentalization of medicine. In some instances the spur was principally the enormous increase in information (as in pathology), while in others it was newly devised instruments requiring special experience (as in urology and laryngology). Another factor was the abandonment of humoral ideas of general disease in favor of a focus on local organs in diagnosis and treatment.
Some examples may be cited. The invention of the head mirror by the country practitioner Friedrich Hofmann in 1841 aided specialization on the ear. In Britain, the first surgeon for ear diseases was James Yearsley, who at mid-century founded a hospital devoted entirely to the ear. William Wilde (1815-76), Oscar Wilde’s father, helped to establish St. Mark’s Hospital for the ear and eye in Dublin. Operation on the mastoid for infection, which became a common procedure for many decades, was brought into otology by Hermann Schwartze in the 1870s. The first hospital in England specializing in the throat was a contribution of Morrell Mackenzie (1837-92). In the United States the organization of the Metropolitan Throat Hospital and the New York Laryngoscopic Society, both in 1873, was due to the efforts of Clinton Wagner.
Diseases of the eye, ear, nose, and throat were at first combined in one specialty. The first professor of ophthalmology was Georg Joseph Beer, appointed in Vienna in 1812, although a special dispensary for the eye had been formed in England in 1805. The ophthalmoscope invented by Helmholtz in 1851 was an incentive to specialization, as were the refractive principles of Donders and the surgical contributions of von Graefe.
The itinerant, irregular bladder-stone removers of ancient and medieval times were in a sense early urological specialists. The invention of instruments which could be passed into the bladder for observation gave impetus to the specialty. Nitze and Leiter in Germany, by improving earlier inadequate devices, constructed the first practical cystoscope. Since this was before the invention of the electric light bulb, the light source was an exposed platinum wire lit by electric current. After X-rays were introduced by Wilhelm Conrad Roentgen (1845-1923), it was not until the 1920s that a feasible technique was devised for adequately visualizing the urinary tract. The intravenous method reported by Swick in 1929 was the forerunner of the later sophisticated angiography (injecting radiopaque dyes into the bloodstream to make the vascular system visible in X-rays). Much of urology was done by general practitioners and surgeons in the nineteenth century. Even in the 1930s, outstanding hospitals and teaching institutions still combined urology and general surgery in the same department.
The spirit of the eighteenth-century Enlightenment and Rousseau’s writings were among the incentives to concentrate on the problems of children. Nils Rosén von Rosenstein, George Armstrong, and William Cadogan were pioneers in this specialty. Charles Billard in France and Charles West in Britain were important contributors in the nineteenth century. In the United States, Abraham Jacobi, who had fled Germany because of his espousal of the political and social reforms of 1848, soon found himself giving most of his attention to children’s diseases and influencing others to do the same.
Scientific dermatology had its beginnings in Hebra’s work in the New Vienna school, but Lorry, Alibert, and Willan had taken the earlier steps. Syphilis was an important part of dermatologic practice until well into the twentieth century, when its protean manifestations brought it into internal medicine. Philippe Ricord and Jean-Alfred Fournier clarified the clinical nature of syphilis and separated it from other venereal diseases.
Neurology was relatively late in becoming a separate specialty, and then it was often combined with psychiatry. Neuropsychiatrist was a common title after Pinel. Psychiatrists such as Janet, Esquirol, Bayle, and Georget gave France the leadership until the reports of Griesinger and others drew attention to Germany. Emil Kraepelin’s classification of mental disease into dementia praecox, manic-depressive psychosis, and paranoia was useful to the new specialty.
In the nineteenth and twentieth centuries, specialties and subspecialties became more and more numerous, so that now there is virtually no general branch of medicine or surgery without its subdivisions of specialization.
Pharmacy has been a part of medical practice throughout the centuries. The physician frequently compounded and dispensed drugs in addition to practicing medicine, and the apothecary often engaged in medical practice as well as compounding and dispensing. Rivalry between the two groups, which was intense in the seventeenth century, continued into the nineteenth century. The respective roles of the physician and the apothecary or pharmacist gradually became clearer, but in some countries, notably the United States in the nineteenth century, the physician continued to prepare and sell medications out of economic necessity.
The social position of the pharmacist in most places was high, and educational requirements after the seventeenth century became more and more rigorous, especially in Italy. In France the new standards grew to include a university education, special training internships, and even specialized certifications for clinical laboratory analysis, community practice, or industrial pharmacy. In Germany, where the pharmacist seems virtually always to have occupied a high social and professional position, the apprenticeship system evolved into an elaborate progression of examinations leading to a stratification by educational accomplishment.
In recent years the pharmacist, especially in the U.S., has become primarily a merchant and dispenser of medicines, owing to economics and the decreasing need for the compounding of prescriptions.
Lists of drugs to guide therapeutics have existed since ancient times, but the word pharmacopoeia (which means the making of medical substances) was first applied to such a listing in the sixteenth century. However, it was not until the nineteenth century that national pharmacopoeias were developed: Prussia in 1799, Austria in 1812, France in 1818, United States in 1820, Britain in 1864, and Germany in 1872. Many of these standard listings continued for a long time to include some of the bizarre, ancient substances combined in multi-ingredient formulas. For instance, theriac was still in the pharmacopoeia of London in the eighteenth century. Therapeutic agents in practice frequently did not keep pace with advances in general science, biology, physiology, and chemistry.
Dentistry really began its professionalization as an independent discipline with the work of Pierre Fauchard (1678-1761), who was the first clearly to devote full time to the teeth. He collated the considerable body of information that had accumulated through the centuries and described the use of tin and lead for filling cavities, but more importantly he established the ethical principle that secret methods should be openly reported in detail so that the results could be evaluated and used by others. Fauchard also emphasized the need for special training of doctors of the teeth and for the examination of candidates by those experienced in the discipline instead of by surgeons. His The Surgeon Dentist (1728), which became the authoritative text for generations, was the foundation of subsequent dentistry. Writings by others in France followed rapidly: Devaux (who also collaborated with Fauchard), Gerauldy, Bienn, Mouton (who constructed the first gold crowns and other new prostheses), Bourdet (who devised new instruments), and many others. Duchateau, an apothecary in the region of Sevres, molded the first porcelain dentures.
In Germany, incidental dissertations on the teeth by physicians and surgeons were replaced by reports from specialists such as the dentist to Frederick the Great, Philipp Pfaff, who in 1755 described how to make plaster models from impressions in wax. The craftsmen (usually woodworkers) who actually fashioned the prostheses designed by Adam Brunner were the forerunners of dental technicians.
Dentistry gradually became a separate specialty in other countries too, but it was in the United States especially that dentistry reached its fullest development in the nineteenth century and afterward, largely due to the efforts of Horace H. Hayden (1768-1844) and Chapin C. Harris (1809-60). The introduction of anesthesia by dentists was as important to dental procedures as it was to the surgery of other organs.
The first dental school in the world was established in 1839 as the Baltimore College of Dental Surgery. In 1870, although there were 10,000 dentists in the United States, only 1,000 were graduates of a school.
Advances in prostheses, such as the production of vulcanite in 1855 by Charles Goodyear, technical innovations in the management of cavities, improvements in the correction of occlusive derangements, and the elevation of educational standards gave American dentistry world leadership.
Eventually the specialization of dentistry, with its complex techniques, became so complete that it was separated from medical practice. However, in recent decades, the physiology and surgery of the head, neck, and mouth have brought a greater interdependency among physicians, surgeons, and dentists.
Since nursing only became fully established as a profession in the nineteenth and twentieth centuries, we are accustomed to regard nursing care in earlier centuries as rudimentary and unstructured. Yet in India, hundreds of years before Christ, Charaka had summarized four qualifications for a nurse: “knowledge of the manner in which drugs should be prepared or compounded for administration, cleverness, devotion to the patient waited upon, and purity (both of mind and body).” We are also apt to think of nurses as exclusively women, but throughout history males also have attended to the sick in hospitals. During the Crusades, the Hospitalers of St. John, the Teutonic Knights, and the Knights of St. Lazarus performed nursing duties, and male members of the mendicant orders of St. Dominic (the black friars) and St. Francis (the gray friars) also acted as nurses in the Middle Ages.
Nevertheless, women have been the principal performers of nursing duties in every period and every country. The nuns of religious orders, such as the Poor Clares, and secular groups with religious purposes, such as the Tertiaries of St. Francis and the Beguines of Flanders, carried on most of the nursing in medieval and even later times. Perhaps the oldest religious group devoted entirely to nursing was the order of Augustinian Nuns in the Hôtel-Dieu of Paris. Indeed, the idea of attending the sick is so closely associated with the Church that even in hospitals which are totally nonreligious the nurses are often called “sister.”
During the Reformation, however, hospitals were generally removed from Church connection or control. The dedicated, free services of the nuns and charitable secular groups were frequently replaced by those of poorly paid workers. Hospitals tended to become filthy, germ-infested buildings where people often died of infection rather than of the illness that brought them there. Sick people who could afford it were treated at home. A reactive move toward cleanliness and humanitarianism engendered by the Enlightenment of the eighteenth century was turned back again by the economic and social changes of the Industrial Revolution. The arduous, menial, and sometimes repulsive tasks involved in caring for the sick were certainly no inducement to anyone to go into nursing as a wage-earning activity, especially when industry opened up much more rewarding positions.
John Howard in the eighteenth century had shocked the upper classes with his book Hospitals and Lazarettos. Dorothea Lynde Dix (1802-87), in England and the United States, mounted a personal campaign which eventually achieved the transfer of the mentally ill from brutality and negligence in penal institutions to psychiatric hospitals with more appropriate nursing facilities. Elizabeth Gurney Fry (1780-1845), an English Quaker, organized the Society of Protestant Sisters of Charity in 1840, which attempted to send nurses into the homes of the sick whether poor or rich. Theodor Fliedner (1800-64), a Lutheran minister in Germany, and his wife Frederika were influenced by Fry’s work. In 1835 they established a modest hospital in Kaiserswerth, staffed without pay by the deaconesses of his church, in which the character, health, and education of nurses achieved a high standard.
Others, too, attempted to better the lot of the sick by upgrading hospitals and nurses, but it was Florence Nightingale (1820-1910), with a virtually single-minded sense of mission to make over nursing, who was the motivating force that led toward a truly professional status for nurses. Her interest was not to establish a feminist movement but, rather, to provide more highly skilled and humane treatment of the ill. She nursed her grandmother through a terminal illness, as well as the tenants on her father’s estate, but her first formal exposure to medicine was a three-month course of training at Kaiserswerth, with the deaconesses.
Her experiences in various charitable institutions, during which she wrote critical reports of the needs of hospitals, were finally crowned with the assignment by Sidney Herbert, the Secretary at War, to take a contingent of Catholic, Anglican, and secular nurses to Scutari to care for the British wounded in the Crimean War. Miss Nightingale found conditions in the overcrowded military hospitals appalling: miles of dirty beds, no facilities or equipment with which to care for or properly feed the soldiers, and a mortality rate which at times reached over forty percent.
Although most of Miss Nightingale’s hours were spent in organizing, directing, and writing, the soldiers quickly responded to her obvious concern for their welfare. “We lay there by the hundreds; but we could kiss her shadow as it fell and lay our heads on the pillow again content.” Intense opposition to her by local military officials evaporated gradually in the face of ever-increasing casualties and deaths. Her presence and administrative genius during the years 1854 and 1855 saved the hospital from total demoralization. After the war, in renewing her fight to reform the military system, she was responsible for the establishment of the first military medical school and also for many other innovations which made military barracks safer and more sanitary. She also had many rebuffs and disappointments along with her successes. When Secretary of War Sidney Herbert was about to die in 1861, he said to his wife, “Poor Florence, poor Florence, our joint work unfinished.”
In civilian life, she was also the moving spirit and architectural mind behind the reconstruction of St. Thomas’s Hospital and its founding as an educational institution for nurses, whose first class was graduated in 1861. Miss Nightingale’s energies and writings were in large measure responsible for the transformation of nursing from a low, unpopular, almost casual endeavor into a highly respected, essential part of the healing arts. However, her crusade was not without its personal cost. Worn down by resentment, bickering, and exhausting activity, she had a number of illnesses that probably were largely nervous breakdowns. Her health had remained fragile ever since she contracted a serious febrile illness (probably typhus or typhoid) in the Crimea; nevertheless, she continued to write intensively and to exert considerable influence.
Not all of the opposition to Miss Nightingale was merely personal. Even in the twentieth century, some leaders of nursing believe that the Nightingale focus on bedside care to the virtual exclusion of more scientific methods of teaching and practicing is too narrow. Curiously, she was not convinced that bacteria caused disease and continued to hold the ancient belief in “miasmas” as responsible. But she preached the necessity for cleanliness and saw clearly that the separation of maternity patients from sick people in a hospital was essential to their safe care. Her basic tenets are still cogent: “The art is that of nursing the sick. Please mark, not nursing sickness… This is the reason why nursing proper can only be taught at the patient’s bedside and in the sick room or ward. Lectures and books are but valuable accessories.”
The Red Cross
Since the sixteenth century, many agreements had been mutually arrived at by opposing forces regarding the treatment of prisoners and the wounded, but in practice these rules were rarely followed. During the military action of the combined forces of France and Sardinia against Austrian troops in 1859, Jean Henri Dunant (1828-1910), a Swiss banker, happened to visit the scene of battle at Solferino in northern Italy after the fighting had ceased. The pitiable condition of the tens of thousands of wounded soldiers still lying unattended on the ground so aroused him that he immediately set about persuading the victorious French commanders to free the captured Austrian military surgeons to help care for the injured of all three nations. Dunant himself pitched in to try to save as many lives as possible. “Tutti fratelli” (all brothers), he kept repeating when local civilians resisted helping the enemy wounded. His book, Un Souvenir de Solférino, published three years later, shocked European leaders into action. Writers such as Victor Hugo, the Goncourt brothers, and Joseph Ernest Renan took up the cry for international humanitarianism. At the second of two international conferences, the Geneva Convention of 1864, sixteen nations signed a treaty establishing the International Red Cross and specifying the regulations that were to govern the treatment of wounded soldiers: all hospitals, military and civilian, were to be neutral territory, and medical personnel of any country, together with their equipment, were to be free from seizure or molestation. The protective insignia was to be a red cross on a white field (the reverse of the Swiss flag). The new spirit passed its first test in 1866, when a group of volunteer civilian students entered the battlefield to care for the Austrian wounded after the battle of Königgrätz. Austria, which had withheld its signature from the original convention, immediately joined.
Dunant lost his fortune—some say because of lavish expenditures in founding the Red Cross—and in 1867 he was bankrupt. After dropping out of sight for about fifteen years, he was discovered in a small home for the aged in Switzerland, poor in resources and unstable of mind. In 1901, when he received the first Nobel Peace Prize (together with Frédéric Passy), he donated the entire sum to charity.