The Modern Health Care System
Modern medical practice began at the turn of the twentieth century. Before 1900, medicine had little to offer the average citizen, since its resources were mainly physicians, their education, and their little black bags. At that time physicians were in short supply, although for different reasons than those that exist today. Costs were minimal, demand was small, and many of the services provided by the physician could also be obtained from experienced amateurs residing in the community. The individual’s dwelling was the major site for treatment and recuperation, and relatives and neighbours constituted an able and willing nursing staff.
Midwives delivered babies, and those illnesses not cured by home remedies were left to run their fatal course. Only in the twentieth century did the tremendous explosion in scientific knowledge and technology lead to the development of the American health care system, with the hospital as its focal point and the specialist physician and nurse as its most visible operatives.
In the twentieth century, the advances made in the basic sciences (chemistry, physiology, pharmacology, and so on) began to occur much more rapidly. Discoveries in the physical sciences enabled medical researchers to take giant strides forward. For example, in 1903, Willem Einthoven devised the first electrocardiograph and measured the electrical changes that occurred during the beating of the heart (Figure 1.3). In the process, Einthoven initiated a new age for both cardiovascular medicine and electrical measurement techniques.
FIGURE 1.3 (a) An early electrocardiograph machine and (b) a modern ECG setup. Advances in computer technology and electronics have greatly simplified and strengthened the ECG as a diagnostic tool.
Of all the new discoveries that followed one another like intermediates in a chain reaction, the most significant for clinical medicine was the development of x-rays. When W. C. Roentgen described his “new kind of rays,” the human body was opened to medical inspection. Initially these x-rays were used in the diagnosis of bone fractures and dislocations. In the United States, x-ray machines brought this “modern technology” to most urban hospitals. In the process, separate departments of radiology were established, and their influence spread, with almost every department of medicine (surgery, gynaecology, and so forth) advancing with the aid of this new tool. By the 1930s, x-ray visualization of practically all the organ systems of the body was possible through the use of barium salts and a wide variety of radiopaque materials.
The power this technological innovation gave physicians was enormous. The x-ray permitted them to diagnose a wide variety of diseases and injuries accurately. In addition, because it was housed within the hospital, the x-ray helped trigger the transformation of the hospital from a passive receptacle for the sick poor into an active curative institution for all the citizens of American society.
The introduction of sulphanilamide in the mid-1930s and penicillin in the early 1940s significantly reduced the main danger of hospitalization: cross-infection among patients. With these new drugs in their arsenals, surgeons were able to perform their operations without prohibitive morbidity and mortality due to infection. Also, despite major early twentieth-century advancements in the field of haematology (including blood type differentiation and the use of sodium citrate to prevent clotting), blood banks were not fully developed until the 1930s, when technology provided adequate refrigeration. Until that time, “fresh” donors were bled, and the blood was transfused while it was still warm.
As technology in the United States blossomed, so did the prestige of American medicine. From 1900 to 1929, Nobel Prize winners in physiology or medicine came primarily from Europe, with no American among them. In the period 1930 to 1944, just before the end of World War II, 19 Americans were honoured as Nobel Prize Laureates. During the post-war period (1945–1975), 102 American life scientists earned similar honours, and from 1975 to 2009, the number was 191. Thus, since 1930 a total of 312 American scientists, including some born abroad, have performed research that was significant enough to warrant the distinction of a Nobel Prize. Most of these efforts were made possible by the technology that was available to these clinical scientists.
The employment of the available technology assisted in advancing the development of complex surgical procedures. The Drinker respirator was introduced in 1927, and the first heart-lung bypass was performed in 1939. In the 1940s, cardiac catheterization and angiography (the use of a cannula threaded through an arm vein and into the heart with the injection of radiopaque dye for the x-ray visualization of lung and heart vessels and valves) were developed. Accurate diagnoses of congenital and acquired heart disease (mainly valve disorders due to rheumatic fever) also became possible, and a new era of cardiac and vascular surgery began. The development and implementation of robotic surgery in the first decade of the twenty-first century have further advanced the capabilities of modern surgeons. Neurosurgery, both peripheral and central, and vascular surgery have gained significant new capabilities with this technology (Figure 1.4).
Another child of this modern technology, the electron microscope, entered the medical scene in the 1950s and provided significant advances in visualizing relatively small cells. Body scanners using early PET (positron-emission tomography) technology to detect tumours arose from the same science that brought societies reluctantly into the atomic age. These “tumour detectives” used radioactive material and became commonplace in newly established departments of nuclear medicine in all hospitals.
FIGURE 1.4 Changes in the operating room: (a) the surgical scene at the turn of the century, (b) the surgical scene in the late 1920s and early 1930s, and (c) the surgical scene today. (From J. D. Bronzino, Technology for Patient Care, St. Louis: Mosby, 1977.)
The impact of these discoveries and many others was profound. The health care system that consisted primarily of the “horse and buggy” physician was gone forever, replaced by the doctor backed by and centred around the hospital, as medicine began to change to accommodate the new technology.
Following World War II, the evolution of comprehensive care greatly accelerated. The advanced technology that had been developed in the pursuit of military objectives now became available for peaceful applications, with the medical profession benefiting greatly from this rapid surge of technological “finds.” For instance, the realm of electronics came into prominence. The techniques for following enemy ships and planes, as well as providing aviators with information concerning altitude, air speed, and the like, were now used extensively in medicine to follow the subtle electrical behaviour of the fundamental unit of the central nervous system—the neuron—or to monitor the beating heart of a patient.
The Second World War also brought a spark of innovation to the rehabilitation engineering and prosthetics fields. With advances in medical care technologies, more veterans were returning home alive, but disabled. This increase in need, combined with a surge in new materials development in the late 1940s, spurred the growth of assistive technologies during the post-WWII era.
Science and technology have leapfrogged past each other throughout recorded history. Anyone seeking a causal relation between the two was just as likely to find technology the cause and science the effect, with the converse also holding true. As gunnery led to ballistics and the steam engine gave rise to thermodynamics, so did powered flight lead to aerodynamics. However, with the advent of electronics this causal relation has been reversed: scientific research is now systematically exploited in the pursuit of technical advancement.
Just as World War II sparked an advancement in comprehensive care, the 1960s enjoyed a dramatic electronics revolution, courtesy of the space program that culminated in the first lunar landing. What was considered science fiction in the 1930s and 1940s became reality. Devices continually changed to incorporate the latest innovations, which in many cases became outmoded in a very short period of time. Telemetry devices used to monitor the activity of a patient’s heart freed both the physician and the patient from the wires that previously restricted them to the four walls of the hospital room. Computers, similar to those that controlled the flight plans of the Apollo capsules, now completely inundate our society.
Since the 1970s, medical researchers have put these electronic brains to work performing complex calculations, keeping records (via artificial intelligence), and even controlling the very instrumentation that sustains life. The development of new medical imaging techniques such as computerized tomography (CT) and magnetic resonance imaging (MRI) depended entirely on continually advancing computer technology. Newer imaging developments include functional MRI (Figure 1.5), a tool capable of illustrating active neural areas by quantifying oxygen consumption and blood flow in the brain. The technological discoveries that could be cited are so numerous that it is impossible to mention them all.
“Spare parts” surgery is now routine. With the first successful transplantation of a kidney in 1954, the concept of “artificial organs” gained acceptance and officially came into vogue in the medical arena (Figure 1.6). Technologies for prosthetic devices, such as artificial heart valves and artificial blood vessels, were developed. A program to develop an artificial heart as a replacement for a defective or diseased human heart was even begun.
FIGURE 1.5 (a) A modern fMRI medical imaging facility and (b) an fMRI scan image.
FIGURE 1.6 Transplantations performed today
Given the neural function, resilience, and incredible mechanical strength and endurance of the human heart, complete replacement prosthetics have been only marginally successful. Left ventricular assist devices (LVADs), however, have seen success as a replacement for the “workhorse” region of the heart and are a popular temporary option for those awaiting a full heart transplant. Future directions for heart failure solutions will most likely involve treatments at the tissue and cellular level, as opposed to macromechanical systems. These technological innovations have vastly altered surgical organization and utilization, further enhancing the radical evolution hospitals have undergone from the low-tech institutions of just 100 years ago to the advanced medical centres of today.
In recent years, technology has struck medicine like a thunderbolt. The Human Genome Project was perhaps the most prominent scientific and technological effort of the 1990s. Some of the engineering products vital to the effort included automatic sequencers, robotic liquid-handling devices, and software for databasing and sequence assembly (see Figure 1.7). As a result, a major transition occurred, moving biomedical engineering to focus on the cellular and molecular level rather than solely on the organ system level. With the success of the “genome project,” completed in 2003 after a 13-year venture, new vistas have been opened. Stem cell research highlights this cellular and molecular level focus and has been at the forefront of controversial scientific research since its inception. While the multitude of possibilities defies imagination, the moral issues accompanying stem cells have received equal attention in recent years.
FIGURE 1.7 Stem cell research—potential applications made possible
FIGURE 1.8 Robotic surgery—a new tool in the arsenal of the physician
Furthermore, advances in nanotechnology, tissue engineering, and artificial organs are clear indications that science fiction will continue to become reality. However, the social and economic consequences of this vast outpouring of information and innovation must be fully understood if this technology is to be exploited effectively and efficiently.
As one gazes into the crystal ball, technology offers great potential for affecting health care practices (Figure 1.8). It can provide health care for individuals in remote rural areas by means of closed-circuit television health clinics with complete communication links to a regional health centre. Development of multiphasic screening systems can provide preventative medicine to the vast majority of our population and restrict hospital admissions to those requiring the diagnostic and treatment facilities housed there. With the creation of a central medical records system, anyone moving or becoming ill away from home can have records made available to the attending physician easily and rapidly. These are just a few of the possibilities that illustrate the potential of technology in creating the type of medical care system that will indeed be accessible, high quality, and reasonably priced for all. (For an extensive review of major events in the evolution of biomedical engineering, see Nebeker, 2002.)