In the early years of the twentieth century, Dutch physicist Willem Einthoven adapted to medical practice a newly discovered instrument for measuring minute electric currents. This instrument, the string galvanometer, provided the first practical tool for recording the electrical activity of the human heart in the electrocardiogram (ECG). By the end of World War I, the string galvanometer had so proved its usefulness in clinical medicine that the science of electrocardiography, especially as developed by Thomas Lewis in England, became a cornerstone of clinical cardiology. With the continued application of basic science to medicine, ECG machines were so improved as to permit not only “spot checks” of cardiac function but continuous recordings of disturbed heart rate and rhythm in acutely ill patients. This led directly to the concept of the coronary care unit, in which prompt recognition of electrocardiographic warnings has significantly reduced the death rate from heart disease.
Studies in the first half of this century of altered pressures and flows in the hearts of animals with natural and induced abnormalities similar to those found in man laid the foundations for modern cardiology and for open heart surgery. In the early 1940s, when Andre Cournand and Dickinson W. Richards in New York provided a safe and practical method to measure pressures and flows in the human heart, the technique of cardiac catheterization was coupled with advances in thoracic surgery to make use of the earlier knowledge of cardiac abnormalities in experimental animals.
Many new drugs have been added to the relatively few effective agents against heart disease discovered in previous centuries. Moreover, newer mechanical and pharmacological methods have often enabled medical and nursing personnel to resuscitate some people stricken with a cardiac catastrophe who heretofore would have been beyond help.
Today we are witnessing yet another phase in the growth of basic knowledge and its application to the cardiac patient. Understanding of the molecular basis for cardiac disease promises considerable benefit in two areas: atherosclerosis (a disease process of complex origins which interrupts the blood supply of the heart and so causes heart attacks) and arrhythmias (disturbances of the orderly beating of the heart which can cause symptoms and even sudden death). Both of these disease processes are now being illuminated by the discoveries of molecular biology (the study of the basic chemical structure of living tissues).
Hypertension (High Blood Pressure)
The first person to measure blood pressure was the clergyman Stephen Hales, who in 1733 inserted a hollow tube into the neck artery of a horse and was astonished to see the blood rise nine feet in a glass column. This was obviously impractical for regular use with humans, and it took another 143 years before an instrument was invented, by Ritter von Basch, which could measure the blood pressure of a human without breaking the skin. This “sphygmomanometer” was the forerunner of the ingeniously simple device introduced by Scipione Riva-Rocci in 1896, a prototype of the more refined instruments of today. Blood pressure was found to equal the pressure in an inflated cuff compressing the arm at the point where the pulse could first be felt as the cuff was deflated. This was called the systolic blood pressure because it coincided with the contraction of the heart. N. S. Korotkoff, in 1905, by using a stethoscope to monitor the pulse, not only achieved a more accurate reading but also discovered that the pulse sound disappeared, as the cuff pressure declined, at a point roughly coinciding with the relaxation of the heart (diastole), thus establishing the diastolic pressure.
Experimental investigations by Harry Goldblatt, R. Tigerstedt, P. G. Bergman, and others first showed that blood pressure could be affected by a substance elaborated in the kidney. Subsequent intensive studies by many researchers, including G. W. Pickering, M. Prinzmetal, I. Page, F. Volhard, D. van Slyke, and Braun-Menendez, have revealed the existence of a complex enzymatic, hormonal mechanism of blood pressure regulation in which the kidneys, adrenals, and nervous system all play an important part.
Numerous treatments have successfully controlled excessively high blood pressure: various diets; operations on the adrenals, kidneys, arteries, and sympathetic nervous system; and many chemical agents. Certain specific causes of high blood pressure have been discovered, such as tumors of the adrenal gland and narrowing of the arteries leading to the kidneys, but the cause of the majority of instances of hypertension is yet to be explained.
For a long time the heart was considered outside the limits of surgery. As late as 1896, the usually perceptive historian Stephen Paget wrote, “Surgery of the heart has probably reached the limits set by Nature to all surgery; no new method, and no new discovery can overcome the natural difficulties that attend a wound of the heart.” Ironically, that same year, Ludwig Rehn successfully repaired a laceration of the heart, and the era of cardiac surgery began.
Later developments in operations on the heart stemmed from several innovations. First, methods were found to enter the chest safely under anesthesia (resulting from Sauerbruch’s experiences with his special chamber in 1904) and to introduce anesthetic gases under pressure through intratracheal tubes to keep the lungs inflated. Another development came out of early attempts at sewing together severed arteries. After numerous failures by many investigators, Robert Gross in 1948 succeeded in bridging severed ends of the large main artery (aorta) by tissue and artificial grafts.
In the heart itself, early attempts to correct valve abnormalities were for the most part unsuccessful. The pioneer accomplishment in 1923 by E. Cutler and S. A. Levine of widening a scarred valve inside the heart was followed by numerous failures and only a rare success. Finally, in the 1940s, C. Bailey and D. Haven in the U.S. and H. Sellors and R. Brock in England regularly were able to achieve satisfactory results. When C. Hufnagel in 1952 implanted a synthetic valve, cardiac surgery had reached general acceptance.
An innovation essential to the future of medical and surgical treatment of heart disease came about as Werner Forssmann, a young intern in a German hospital, was trying to work out a technique for emergency injection of drugs directly into the heart. In 1929, while standing behind a fluoroscopic screen and looking into a mirror, he threaded a thin catheter inserted into his own arm vein through the venous channels and into his heart. Daring as this procedure was, it actually had been performed fifteen years earlier by Bleichroeder, Unger, and Loeb without the aid of X-rays. After intracardiac catheterization came into use, investigators were able to perform detailed, quantitative physiological studies, and A. Cournand and D. W. Richards, Jr., were corecipients with Forssmann of a Nobel prize in 1956. Cardiac catheterization also permitted X-ray visualization of the interior of the heart and blood vessels by the injection of radiopaque substances. Another noteworthy innovation was the use of pacemakers to keep the heart beating even though scarring interfered with the transmission of contracting impulses throughout the heart muscle (heart block).
Until mid-century, various remarkable reparative operations on congenital defects and malarrangements of the large blood vessels were performed inside the heart entirely by feel since the heart had to be kept pumping blood. These blind manipulations were based on anatomical dissections by Rokitansky in the nineteenth century and Maude Abbott in the twentieth, who had thoroughly classified cardiac defects. In 1939 Robert Gross reported the first successful cure of a congenital heart anomaly. Acting on suggestions by E. Park and Helen Taussig, Alfred Blalock devised procedures which rearranged the abnormal arterial connections of newborn “blue babies,” thus dramatically saving these heretofore doomed children and prompting corrective operations on other congenital defects as well.
However, to permit open, careful procedures on the heart under direct vision, a means was required to keep oxygenated blood circulating, especially to the brain, without action by the heart. After nineteen years of intensive experiment, John Gibbon, his wife, and others constructed a heart-lung machine which accomplished this and enabled him, in 1953, to close a defect inside the heart successfully under direct visualization. Variations of the heart-lung machine subsequently have been used in an increasing number of cardiac operations and to support patients suffering from acute heart attacks. The heart-lung machine and the intracardiac catheter have also made possible recent operations devised to bypass blocked coronary arteries, which nourish the heart muscle.
It was to be expected that eventually attempts would be made to replace a hopelessly inadequate heart with a living transplant. Such transplantations in animals, along with intensive study of the physiologic consequences, had been reported in the 1950s and early 1960s, but in 1967 Christiaan Barnard in South Africa actually performed the first cardiac transplantation on a human. During the following five years, well over a hundred heart transplants were tried, with some recipients surviving several years, but as of 1977 there were virtually no long-term survivors and the procedure had been almost abandoned, principally because of an inability to control rejection by the body of the newly transplanted heart.
The use of mechanical devices either to support the damaged heart or to replace it completely (for short periods of time) was first reported in animals by T. Akutsu and W. Kolff in 1958 and later by De Bakey and others. D. Cooley and coworkers in 1969 employed an artificial heart for the first time to keep a patient alive for about two and a half days while awaiting the transplantation of a living heart.
Whether the future course of cardiac surgery will be in the direction of living heart transplantation or mechanical replacement depends in large measure on increased understanding of the immunologic mechanism and on advances in bioengineering.
Surgery of the Arteries
Modern vascular surgery began near the end of the last century when Matas of New Orleans developed the first operation directly on an arterial aneurysm. About the same time, the famous Russian physiologist Eck performed a vascular connection between the portal vein and the vena cava, two principal venous trunks in the abdomen. After the turn of the century, Carrel developed a scientific method of connecting the ends of blood vessels (applicable to either small or large arteries and veins) which achieved a watertight suture without narrowing the caliber of the vessel. Nevertheless, a long time elapsed before the technique came into clinical application.
The success of early operations on the heart and great vessels stimulated further research throughout the world on surgery of the blood vessels. Robert Gross first used segments of arteries harvested from people who died in accidents to create a shunt between systemic and pulmonary circulations. Dos Santos of Lisbon attempted to recanalize an occluded artery by removal of the thrombus and the sclerotic plaques on its inner lining. Goyanes of Spain was the first to use a segment of popliteal vein to restore the continuity of the popliteal artery (a large artery along the back of the leg) after excising an aneurysm. In 1948, Kunlin of Paris used a segment of the saphenous vein (a long vein in the leg) to bypass a blockage of the main artery in the extremity. In 1950 Oudot of France performed the earliest successful resection of an occluded bifurcation of the aorta (the main artery in the abdomen), bridging the gap with a homograft (another person’s artery). The first effective resection of an abdominal aneurysm of the aorta and insertion of a homograft was reported by Dubost of Paris in 1952.
Nevertheless, homografts were difficult to procure and they suffered anatomical alterations over a period of time, so the need for a synthetic graft material became evident. Glass and aluminum tubes had been tried by Carrel and Tuffier in World War I, and vitallium, polyethylene, and siliconized rubber were also used later. However, the only clearly successful and practical graft material appeared to be a synthetic fabric; the first to be used experimentally on dogs was porous vinyon and vinyon “N” cloth, which Voorhees employed in humans in 1953. Later, many other materials were created, but the search for an ideal artificial graft continues.
In 1921, Nylen used a monocular microscope in performing an operation on the ear for deafness. A year later Holmgren described the successful use of a binocular magnifying instrument which became the prototype for all subsequent microsurgical apparatus in various fields; Donaghy, and Jacobson and Suarez, for instance, reported in 1964 on microsurgical techniques for joining small blood vessels. In recent decades limbs have been saved from gangrene due to arteriosclerosis by procedures which bypass the blocked vessels. Furthermore, through teamwork among several disciplines, severed limbs have sometimes been reconnected with survival of the limb and eventual restoration of much of its function.
A whole field of science rarely develops overnight, but it might well be said that modern gastroenterology was born on the morning of June 6, 1822, when Dr. William Beaumont treated the severe gunshot wound of Alexis St. Martin, a wound which left his stomach permanently exposed through his abdominal wall. The classic series of experiments conducted by Beaumont proved the presence of hydrochloric acid in the gastric juice, established the intimate relationship between emotional state and gastric secretion and digestion, delineated the details of gastric motor activity, and in many other ways opened the frontiers of physiological research in gastroenterology.
In 1902, the discovery by William Bayliss and Ernest Starling in London that a chemical substance from intestinal tissue (which they called “secretin”) was capable of stimulating secretion from the pancreatic gland revolutionized biologic science by proving that organ function could be regulated by chemicals as well as by nerves. Thus the field of endocrinology was born as an offspring of gastroenterology. In 1905, William Hardy’s newly coined word, “hormone” (Greek for “I arouse activity”), was first applied in print to the whole class of other postulated chemical messengers like secretin. At the very same time, John Edkins was demonstrating the presence of a stomach-acid-stimulating chemical messenger in the lower half of the stomach of dogs, which he named “gastrin.”
Other hormones influencing gastrointestinal function were reported in subsequent years, and today there are over two dozen established and presumed gastrointestinal hormones—with more undoubtedly yet to come. Among the many other noteworthy contributions to understanding the mechanisms involved in diseases of the alimentary tract was the introduction by Dragstedt and Owens of the operation which cuts the vagus nerves for the treatment of peptic ulcer.
In 1965, in a genetics laboratory in Philadelphia, B. S. Blumberg with his colleagues discovered by chance a virus antigen which became the key to the mystery of serum hepatitis (a liver ailment following transfusion), for which he received a Nobel prize in 1976.
In addition to the X-ray techniques for visualizing the alimentary tract, other important technical innovations have included I. J. Wood’s gastric suction tube in 1949, an intestinal biopsy tube by Margot Shiner in 1958, the liver biopsy needle by Menghini, and numerous endoscopes which permit visual inspection, tissue sampling, and even surgical manipulation of the esophagus, stomach, duodenum, colon, abdominal cavity, and the ducts of the pancreas and bile system.
One of the outstanding advances in medicine of the twentieth century was the discovery that a correctable nutritional deficiency, associated with the absence of acid in the stomach, was responsible for the fatal disease pernicious anemia. After many years of painstaking animal experiments George Richards Minot reported the cure of a human with the disease by the ingestion of a large quantity of liver. The combined and separate efforts of William Parry Murphy, George Whipple, Edwin Cohn, William Castle, and others were responsible for finally discovering, isolating, and proving that the missing factor was a vitamin (B12). Minot, Murphy, and Whipple received a Nobel prize in 1934.
Perhaps the most famous contribution to endocrinology was the isolation of insulin by Frederick Banting and Charles Best in 1921. Over subsequent decades, various types of injectable long-acting insulins were prepared for the treatment of diabetes, and other chemical agents were synthesized which could be taken by mouth to lower blood sugar.
Clarification by animal experiments of the function of the parathyroid glands, located in the neck, and their relationship to calcium in the blood and bones permitted the effects of parathyroid tumors to be understood. Collip and Hanson in 1925 and Copp in 1962 further illuminated the physiological role of the hormones secreted by these glands.
The discovery in the 1920s of the sex-organ-stimulating principles of the pituitary gland (at the base of the brain) was followed by the introduction in 1928 by Aschheim and Zondek of the first usable pregnancy test (the “A-Z test”), based on hormones from the placenta. Soon after, the chemical structures of female sex hormones were delineated and their relationship to the menstrual cycle explained. Based on this understanding, oral contraceptive drugs were later introduced in the 1950s by Gregory Pincus.
The discovery just before 1900 that substances in the inner layer of the adrenal glands (located just above the kidneys) raised blood pressure enabled physicians to recognize the effects of adrenal tumors. Studies on the chemicals secreted by the adrenals, which control the tension of artery walls, led to a clearer understanding of the mechanism of hypertension. The importance of the outer layer of the adrenals to the maintenance of life was only fully appreciated in the 1920s and 1930s, although Addison in the nineteenth century had described the disastrous effects of adrenal disease. Laboratory experiments coupled with clinical observations revealed the interrelationships of the adrenals with the pituitary and the sex organs in health and illness. The isolation and synthesis of hormones from these organs made them available for the treatment of many disease processes (for instance, arthritis, inflammatory conditions, and deficiency states).
The probable functions of the thyroid were learned in 1891 when extracts of the gland helped people with sluggish behavior, increased weight, hair loss, and other symptoms of deficient thyroid activity. In 1910 David Marine and others indicated that goiter (enlargement of the thyroid) represented an iodine deficiency which could be prevented by taking iodine. In the 1940s when investigators succeeded in crystallizing the hormones secreted by the thyroid, the pathologic conditions of the thyroid gland were better understood. Overactive secretions (hyperthyroidism) could be managed by one of several methods: operative removal of a sizable portion of the gland (performed less frequently now); drugs to nullify the effects; and radioactive iodine to diminish thyroid activity.
As more information accumulated, the tiny pituitary gland was seen to be a central control station for virtually all of the endocrine glands. Acting on the sex organs, adrenals, thyroid, and perhaps also directly on other tissues, the pituitary appears to affect many processes, including the growth of bones. Hormones thus represent an important group of chemical messengers by which different organs and tissue systems affect each other.
At the turn of the century, many of the basic tools of ophthalmology already were known: the slit-lamp microscope to examine with magnification the front structures of the living eye, the ophthalmoscope to see the interior, and the tonometer to measure pressure and thus study glaucoma. The gross and microscopic structures of the eye in health and in disease were well known, but complete details of how the eye functions had not yet been worked out. In terms of treatment, there was a crude operation for removing senile cataracts and pilocarpine drops and some simple procedures for glaucoma. Eyeglasses had been known to Europe since the thirteenth century. However, detached retina was untreatable, and there were no antibiotics or effective drugs against infections and inflammations.
Although special eye doctors had existed since antiquity, there were very few physicians who had the training and equipment to devote themselves entirely to ophthalmology. Treatment of most eye diseases was in the hands of general practitioners, or eye, ear, nose, and throat doctors. Spectacles were often sold over the counter or by itinerant spectacle-peddlers after a minimal examination or none at all.
Advances in ophthalmology may be illustrated by the evolution of cataract extraction. By 1900 a reasonably satisfactory operation had replaced the ancient procedure of “couching,” whereby the cataract was simply pushed down out of the line of sight. In the late nineteenth century, removal of the lens had been made considerably more practical by the advent of local anesthesia (introduced by an ophthalmologist, Carl Koller) and by sterile techniques. Over the years innumerable small improvements in instruments, suture materials and needles, and operative techniques have made the operation safer and more effective.
In a similar way, the treatment of glaucoma, cross-eyes, and most other major and minor conditions has gradually improved. Innovations also have occurred in eyeglasses, contact lenses, and medications, but perhaps the most dramatic advances have been in antibiotics, operations for detached retina, and the use of corticosteroids, which have transformed some hitherto hopeless diseases into treatable conditions.
Nevertheless, the best eye care is not yet available all over the world. Of the four leading causes of blindness, only one is a problem of knowledge: river blindness, or onchocerciasis, a type of parasitic infestation carried by flies, for which neither prevention nor treatment is satisfactory. On the other hand, trachoma is easily preventable and treatable, yet it remains a worldwide scourge simply because the remedies are not always available where they are needed. Similarly, cataracts are easy to cure, but only if the patient, a trained surgeon, and an operating room can be brought together. A fourth cause of blindness, malnutrition, which promotes degeneration of the eye structures, is also a sociological problem.
We are almost helpless in the face of such degenerative diseases as diabetic retinopathy and senile macular degeneration, causes of much blindness, and, similarly, we are just beginning to be able to handle some of the inherited diseases. Discoveries in the chemical analysis of genetic defects may teach us more about retinitis pigmentosa, retinoblastoma, and numerous other causes of disability which are transmitted in the genes.
In the ear, nose, and throat specialty (otorhinolaryngology), operations on the ear for infection and for deafness have been among the outstanding contributions of this century. For generations, the disfiguring scar of a mastoidectomy had been a common sight. However, when the antimicrobial agents came into use in the 1940s and 1950s, most ear infections could be managed without major operations, and so mastoidectomy has declined in frequency.
The surgical treatment of otosclerosis, a major cause of deafness, gained its principal impetus from reports by Julius Lempert in 1938 of a fenestration operation which placed the eardrum directly against the labyrinth (an inner ear structure), permitting sound to bypass the usual entrance blocked by otosclerotic disease. Although Holmgren’s introduction in 1923 of the operating microscope (later taken up by other surgical disciplines) had been closely followed by Sourdille’s successful fenestration operations performed in stages, it was not until Lempert’s achievements that otologic surgeons began to take up the operation in earnest.
Samuel Rosen, while performing a fenestration procedure in 1952, noted that freeing the fixed stapes bone immediately improved the patient’s hearing. Subsequently, he developed the now popular and effective stapes mobilization operation. Actually, Boucheron in 1888 had reported on sixty operated cases, Miot in 1890 had written up two hundred instances of stapes mobilization, and Faraci in 1899 had summarized his experiences in thirty cases. However, except for an occasional report thereafter, the technique had been forgotten until Rosen’s chance rediscovery.
Numerous attempts since 1957 to alleviate total deafness due to nerve degeneration, by implanting electrical devices into the inner ear, have been only partially successful. On the other hand, external hearing aids of great variety and ingeniousness continue to be developed.
In 1741 Nicholas Andre, Professor of Medicine at the University of Paris, published a book on the prevention and correction of musculoskeletal deformities in children. For its title he created the word “orthopaedic” from two Greek roots, orthos (straight) and paideia (rearing of children), and his illustration of a staff used to straighten a growing sapling has become the international insignia of orthopaedic societies.
For decades orthopaedists have been physicians or surgeons interested in musculoskeletal deformities and diseases. These were chiefly scoliosis (curvature of the spine), tuberculosis and other infections of the bones and joints, paralysis due to poliomyelitis, and congenital defects such as dislocation of the hip, club foot, and Erb’s palsy (birth paralysis of the arm). Eventually the specialty also came to include fractures, dislocations, and other injuries to the spine and extremities.
Until the twentieth century, most orthopaedic treatment was mechanical, with braces, plaster casts, and manipulation, but some simple operations such as osteotomy (correcting deformed bones by cutting them) and uncomplicated tendon transplants were also done. In 1908, Erich Lexer reported apparently brilliant success with transplantation of total knee joints from one person to another, but this procedure was never taken up by other orthopaedists—possibly because late results did not bear out the early promise. In 1911 Russell Hibbs of New York revolutionized the treatment of scoliosis and spinal tuberculosis by devising a spine fusion operation, which continues to be improved upon and modified.
Fractures of the hip were considered untreatable, and until the present century little was done for them. In the 1930s Smith-Petersen of Boston developed a special nail that could be inserted to hold the fracture fragments together. Shortly thereafter, metal substitutes were devised for the disunited head of the femur. These devices and procedures were improved, especially through the brilliant innovations of John Charnley of England, to the point where a total joint including the socket can be replaced not only for injury but also for some forms of arthritis. At present the procedure appears to be working well for the hip, and something similar is being developed for the knee, ankle, elbow, fingers, and other joints. Joint replacement may become the most important contribution of orthopaedics in this century.
The displaced or “slipped” intervertebral disc, an elastic structure which forms a cushion between adjacent vertebrae in the spinal column, was recognized as a common cause of low back pain and sciatica about 1911. A herniated disc was first removed by Mixter and Barr of Boston in 1934. Although this operation has been evaluated with variable enthusiasm, it remains another major contribution of orthopaedics to medicine.
The neurosciences have inherited centuries of accumulated observations—from the experiences of earliest societies with trauma and the trephining of skulls to the pioneering advances of the nineteenth century. One may especially mention the experimental work of Gustav Fritsch and Eduard Hitzig in 1870, which showed clearly that sensory and motor functions could be localized in the cortex of the brain. William Gowers, Hughlings Jackson, and S. Weir Mitchell were at the forefront in establishing clinical methods for evaluating neural disorders.
The structure and function of the nerve cells and fibers had also been clarified by the significant investigations of Camillo Golgi and Santiago Ramon y Cajal before the twentieth century had finished its first decade. The methods of tissue culture which Ross Harrison devised in 1907 to determine how nerve fibers regenerated after injury became an essential tool for research in other fields, among them vascular surgery and virology. Charles Sherrington and Edgar Adrian received a Nobel prize in 1932 for their investigations on reflexes, nerve impulses, and the mechanism of sensation. In recent decades, investigators have shown that while some electrical principles may operate in the conduction of nerve impulses, chemical transmitters, linkages between cells, and feedback mechanisms are integral parts of the functioning of the nervous system and sense organs. For their discoveries on the physiology of vision, George Wald and Ragnar Granit were awarded a Nobel prize in 1967.
Information derived from many disciplines concerning the detailed makeup and activities of cells has also been utilized to help understand and treat neural dysfunction with drugs and operations. Numerous investigators are illuminating hidden recesses of mental function through studies on consciousness, speech, memory, and sleep.
Surgery of the nervous system owes much to the pioneer work in the nineteenth century of Victor Horsley, often called “Father of Neurosurgery.” The first person successfully to remove a tumor of the neural substance in the spinal cord, in 1887, Horsley also performed many significant animal experiments and successful cranial operations on humans. However, neurosurgery received its greatest impetus from Harvey Cushing, who was responsible for major advances in surgery of the pituitary gland, management of increased intracranial pressure, and treatment of brain tumors. Not the least of his contributions was the training of outstanding neurosurgeons from all over the world. Walter Dandy, one of his brilliant pupils and later his personal antagonist, advanced neurosurgery through innovations in surgical technique and diagnostic procedures.
In the last twenty years, neurosurgical operations have been extended not only to the removal of growths and aneurysms of the nervous system but also to the relief of pain by severing nerve tracts, the palliation of tremors and abnormal behavior, and the beneficial modification of hormonal mechanisms in the treatment of cancer.
ASSOCIATED HEALING PROFESSIONS
In 1840, the world’s first dental school was founded in Baltimore, Maryland. A few years later, dental schools were established in Europe. As in medicine, the dental college course was gradually lengthened from a few months to four years in addition to the required minimum predental training. By the mid-nineteenth century, licenses were being issued by various states. In England, the first licenses were issued in 1859, but the dental profession remained under medical control. In Europe there is still some conflict as to whether dentistry is a specialty of medicine or a profession unto itself.
Prior to the twentieth century, dentistry concerned itself primarily with dental caries (cavities), malposition of teeth, and diseases of the supporting tissues of the teeth. Prevention was largely neglected until techniques were developed in this century to conserve and restore teeth and to prevent and arrest disease. In addition, fluoridation of the water supplies to prevent dental caries has been one of the most successful public health measures ever advocated. Dentistry has now developed the following subspecialties: oral surgery, periodontics (diseases of the supporting structures), pedodontics (children’s dentistry), prosthodontics (replacement of missing teeth), orthodontics (correction of malposed teeth), public health, oral pathology, and endodontics (root canal therapy).
The materials used to fill cavities in the teeth during the Middle Ages were waxes and resins, followed by gold leaf and lead in the mid-fifteenth century. Today’s amalgam fillings, essentially a mixture of silver and mercury, were developed in the early nineteenth century, but other materials have also been introduced. To restore lost or damaged teeth, newly developed synthetic resins and acrylics were adapted to dentistry, and the use of fused porcelain for false teeth became a highly sophisticated art. Other inert materials were also introduced as implants in the jawbones to substitute for lost structures. Some temporary success has even been achieved in the reimplantation of teeth dislodged from their sockets. Experiments have also succeeded in transplanting a tooth from one part of the jaws to another more useful site.
High-speed drilling with water-cooling and the judicious use of local and general anesthetics have for the most part controlled the pain experienced and feared for centuries by patients in the dental chair. The advances in antibiotics, X-ray techniques, and other disciplines in medicine have been taken up by dentistry just as the contribution of general anesthesia by American dentists became the property of medicine.
Since the days of Florence Nightingale, the emphases in training and practice have shifted from purely clinical bedside nursing to include academic and supervisory subjects as well. Hospital-based nursing schools, while still prominent in Europe, have become fewer in the U.S., where education has moved more and more to academic institutions. Advanced degrees beyond the R.N. (Registered Nurse) have extended into many special fields, such as maternal and child care, geriatrics, psychiatry, cardiovascular illness, medical and surgical subspecialties, cancer care, and public health administration.
The earliest activities in medical social service in the U.S. were those performed by nurses who tried to involve patients in occupational therapy. Under medical direction, nurses also gave physical therapy treatments. A visiting-nurse training school founded by Lillian Wald in 1893 was the forerunner of those institutions which train the public health nurse, who goes into homes to oversee the sick and disabled and to educate families in the fundamentals of health and hygiene.
In Britain and other countries, nurses have often had considerable responsibility for the management of patients, which sometimes included duties that in the U.S. devolved upon junior physicians. Recently, however, in intensive care units of hospitals, the specialized nursing staffs have been involved in clinical decisions as well as in administrative planning. Moreover, nurse-clinicians as independent practitioners have begun to appear.
While the professional and economic status of nurses has risen, their intensive, experiential training in individual care has become less based on the bedside. Some deplore this lessening of personal patient care, whereas others point to the advantages of improving the nurse’s expertise and usefulness in overall care.
Regular physicians and specialists in animal illnesses have cared for the flocks and herds over many centuries. However, it was not until the nineteenth and twentieth centuries that the veterinarian became a fully recognized, certified professional with clear-cut education and training accompanied by sophisticated methodology.
Veterinary colleges had been established in France, England, and Scotland in the eighteenth century, but it was not until 1875 that the first veterinary college was founded in America, by the Frenchman Alexandre Liautard, in New York. In 1863, Liautard and Robert Jennings had brought together representatives from seven states to launch the first Veterinary Medical Association. Two years earlier, a veterinary school had opened in Ontario, Canada. Since then, schools of veterinary medicine have increased in number all over the world (although there are still only nineteen in the U.S.), journals have multiplied, and specializations within the profession have divided and subdivided.
There are many examples of far-reaching additions to medical knowledge by veterinarians, of which a few may be cited. Bernard Bang of Denmark, who described a blood disease of fowl, leukosis, and explained the causation of an abortion-producing illness of cattle, was also responsible for devising a test using the tuberculin which Robert Koch had developed. The entire field of virology was opened through the discovery by Friedrich Loeffler and Paul Frosch in 1898 that a specific infection (foot-and-mouth disease of cattle) was caused by a filterable virus. In the first catheterizations of the living heart, the veterinarian J. B. A. Chauveau was Claude Bernard’s coworker; he was also a pioneer in producing attenuated viruses for immunization. F. L. Kilborne, together with the physician Theobald Smith, presented in 1889 the first proof of an arthropod (the cattle tick) acting as a vector in disease. The achievement of the chemotherapy of tuberculosis was linked to the work of William Feldman, who also contributed to the treatment of leprosy with sulfone drugs. Antitetanus immunizations and diphtheria toxoid inoculations arose out of the labors of the veterinarian Gaston Ramon. Recently a team of veterinary and medical investigators at the Wistar Institute in Philadelphia obtained a simple vaccine against rabies which is ninety-nine percent effective in humans even after a rabid bite. The hypodermic syringe was derived from Tabourin’s first crude instrument, and spinal anesthesia was first used in veterinary medicine. Daniel Salmon may be particularly mentioned for his numerous contributions to the control of human disease. Together with Theobald Smith, he established the efficacy of killed microorganisms for making vaccines. Among other fundamental studies, he demonstrated the transmissibility to humans of tuberculosis in cattle. His researches on the paratyphoid bacterium led to its designation with his name as Salmonella.
Diseases of animals have always been significant to human health. Approximately 150 infections (the zoonoses) are transmissible from animals to humans. For instance, the severe infection glanders, originally thought to be confined to horses, was found to be transferable to humans. Similarly, psittacosis (a pulmonary disease of birds), equine encephalitis, and botulism have proved to be of serious consequence to humans as well as animals. Hookworm, a parasite formerly affecting large segments of the poorer population in the American South, Egypt, and other countries, penetrates the skin of those who walk barefoot on contaminated ground. Maurice Hall and Jacob Schillinger rid large areas of the parasite by using chemicals on dogs that carried the organism. A malignant tumor of fowl (chicken sarcoma) was discovered by Peyton Rous to be caused by a virus, but the importance of this finding to human cancer is still debated. However, the observation has set many investigators to searching for a linkage between viruses and cancer. In recent years, other links to malignant tumors have been discovered, as for instance William Hardy’s demonstration that leukemia in cats is caused and transmitted by a virus.
The lessons learned from animal diseases and the experiences of veterinarians in controlling their spread have had considerable impact on public health measures aimed at containing a number of human epidemic illnesses, such as yellow fever, plague, malaria, and cholera. In addition, the danger of mercury poisoning was first detected in cats fed fish from waters contaminated by the metal. Breeders and veterinarians, in noting endocrine derangements in minks who ate domestic chickens, called attention to the potential hazard of administering hormones to fowl raised for human consumption. Recognizing the contributions of veterinarians, health departments have increasingly used their services.
Veterinarians have also played an important part in supervising the activities of research laboratories. Experiments on animal models have been essential to advances in human medicine. The discovery and manufacture of insulin are a prime example. The antipoliomyelitis vaccine is another. Throughout all fields of medicine, new mechanical procedures and operative innovations have usually required testing first on animals. There have been vigorous confrontations between research laboratories and antivivisection groups, but a measure of accommodation in recent decades has led to acknowledgment by one side of the essential role of experiments and agreement by the other to stricter regulation of the care of animals in research institutions.
Veterinary medicine has been important to the quantity and quality of food, the prevention of the spread of disease in all living forms, the application to humans of information derived from animal care, and the maintenance of the health of pets whose presence contributes to the enjoyment and psychological well-being of humans. It is in the capacity of private physician to horses, cattle, and small household animals that most practitioners are to be found. Increasingly, the veterinarian has come to be regarded as a teammate of the physician in the study, care, and treatment of living creatures.
What lies ahead? Soon this century’s ideas and activities will be reviewed by the next century’s historians and scientists—with occasional admiration, we hope; with amused tolerance, perhaps; with astonished dismay, in all likelihood. But we need feel no embarrassment, because each period will take its turn being evaluated by its successors. We enter the future facing backward, seeing only the road on which we have just traveled. We would do well to view today’s medicine as merely a marker between the past and future.