
Volume 3, Issue 1
Spring 2007

Healthcare as though People Mattered: The Industrialization of Healthcare

Jeff Kane, MD

Cell 2 Soul. 2007 Spring; 3(1):a14

Forty-four years ago, our clinical instructor led me and three medical classmates into a patient's room. He chatted with the man, who suffered from fluid in the space between his lung and chest wall. The instructor had us look carefully to evaluate the man's breathing, place our hands on his ribs to feel the asymmetry of movement, listen with our stethoscopes, and finally tap our fingers on his chest to sense the different reverberations from air and fluid. As I took my turn tapping, I happened to make eye contact with the man. He smiled at me and joked, "I feel better already, doctor."

Those techniques — observation, auscultation (listening), palpation and percussion — constituted the classic four imperatives of physical diagnosis. They also amounted to intimate contact between doctor and patient.

Physician-author Abraham Verghese recently published a description of his training in diagnostic percussion,1 lamenting that such low-tech skills are endangered species: in fact, budding physicians are no longer tested on them. "The patient in the bed," he wrote, "is merely an icon for the real patient, who exists in the computer." While Verghese certainly wouldn't recommend abandoning modern medical technology, he speculated that it threatens to become the sole vehicle for the physician's understanding of the patient. One complaint I hear regularly runs, "The doctor barely looked at me, he was so busy entering data into his computer."

In shifting attention toward digital healthcare — and away from notoriously subjective, squirmy human beings — we've undoubtedly improved precision and efficiency. We can move more patients through the system and with greater accuracy. But toward what end? We're productive, but are we healthier?

Our current style, favoring scientific objectivity over personal contact, began only a hundred years ago. We can better understand where we are today by knowing where we came from. The enterprises calling themselves medical schools at the end of the Victorian era ranged from the universally admired Johns Hopkins University to storefronts that required of students only cash and token attendance. The Illinois Board of Health reported in 1899 that 179 American and Canadian medical schools favored some sort of scientific approach, 26 taught homeopathy, another 26 were "eclectic," 13 "miscellaneous," and 13 were outright "fraudulent."2

Industrialists Andrew Carnegie and John D. Rockefeller determined to convert horse-and-buggy medical education into a sleek modern vehicle. They were supported in this by people with a variety of motivations, not least of which was commerce's preference for healthy workers. John Topping, an executive with Republic Steel, observed,

"It is good business to conserve life and health, [for thereby] one of the most important items of economy in production is secured."3

In 1910 Frederick T. Gates, the administrator of Rockefeller's philanthropies, wrote his employer of conditions in southern cotton mills:

"…inefficiency in labor is due to the infection by the hookworm which weakens the operatives."4

Gates' ideas about sickness and health were typical of the day. Not a physician but a minister by training, Gates subscribed to the contemporary view of the human body as a sublime machine. He wrote to Rockefeller,

"The body has a network of insulated nerves, like telephone wires, which transmit instantaneous alarms at every point of danger. The body is furnished with a most elaborate police system, with hundreds of police stations to which the criminal elements are carried by the police and jailed…The body has a most complete and elaborate sewer system…"5

To Gates, nearly all disease was

"…caused by living germs…which finding lodgment in the human body, under favorable conditions multiply with enormous rapidity until they interfere with the functions of the organs which they attack and either they or their products poison the fountains of life."6

That being the case, his gold standard for healthcare was scientific research. He wrote,

"…medicine could hardly hope to become a science until medicine should be endowed and qualified men could give themselves to uninterrupted study and investigation, on ample salary, entirely independent of practice…"7

Though there was little opposition to scientizing medical education, a few instructors expressed sidebar caveats. William Osler, professor of medicine at Johns Hopkins and still considered the physician patriarch of North America, practiced "clinical medicine," which he defined as a necessary composite of science and art. Though he revered science, he insisted throughout his career that human beings couldn't be healed by science alone. (His best-known maxim is, "It is more important to know what sort of patient has a disease than what sort of disease the patient has.") Osler cautioned against any change in medical education that might diminish attention to the patient. His warnings went largely unaddressed, and when he left Johns Hopkins in 1904 for Oxford University, he wrote to a colleague he considered too laboratory-oriented, "Now I go, and you have your way."8

The Carnegie Foundation for the Advancement of Teaching commissioned educator Abraham Flexner to critically examine American medical training. Competent and thorough, Flexner visited every medical school in the country. In his final report, issued in 1910, he recommended standardization based on science. Every medical student since then has learned during the first week of freshman year that the Flexner Report marked the birth of modern, scientific medicine, or what we now call "biomedicine."

After Flexner published his report, Dr. Osler, who had since been knighted, continued to predict that inordinate focus on science would likely eclipse concern for the patient. He wrote,

"The danger would be of the evolution throughout the country of a set of clinical prigs, the boundary of whose horizon would be the laboratory, and whose only human interest was research, forgetful of the wider claims of a clinical professor as a trainer of the young, a leader in the multiform activities of the profession, an interpreter of science to his generation, and a counselor in public and in private of the people, in whose interests, after all, the school exists."9

Despite such misgivings, money from Carnegie and Rockefeller — eventually amounting to over $100 million — established teaching chairs and facilities at America's major medical schools in order to realize Flexner's recommendations.10 Unfunded institutions soon found competition difficult, and in a few years half of America's medical schools closed. Scientization of medical education effectively displaced every other style. I inhabit this lineage, as my own instructors received their training from students of the professors Carnegie and Rockefeller funded.

We're in debt to Flexner for advances that ultimately helped devastate infectious disease and other acute disorders. Yet every rose bears a thorn: biomedicine gradually exhibited a darker side, ever-increasing expense unmatched by improvements in health.

Medical costs have yet to find a ceiling. Daniel Callahan, former director of the Hastings Center, a bioethics think tank near New York City, explains the phenomenon this way:

"It's a lot like exploring outer space. The capacity of medicine to provide ever-advanced technologies is endless. No matter how much you spend, you can always spend more."11

As I noted in my essay "Healthcare as though People Mattered, Part One,"12 for all we spend, we're getting progressively less bang for our buck. Further, we have virtually no handle on our national epidemic of pathogenic, or disease-causing, behavior. In sum, our system of biomedicine, industrialized scientific healthcare, is stuck in a cul-de-sac. It's time to consider a new way of doing things.

Of course, I don't suggest we haul our MRI machines to the landfill, but that we use technology appropriately. Physical tools are effective only in the physical dimension; they can't navigate the human heart. Can we develop a way of evaluating and treating sick people that addresses their suffering, beliefs, hopes and values as fully as we now address their molecules?

Easier said than done. We're not discussing new blood tests here, but a subtler shift, a way of seeing people, in sickness and health, as more than the sum of their meat. Sensing their suffering, in all its quirky subjectivity, along with molecular derangement, we'll behave in entirely different ways.

Fallible beings that we are, we don't let even eminently sensible ideas prevail immediately. Didn't it take the Church more than three hundred and fifty years to pardon Galileo for claiming the earth wasn't the center of the universe? We'll need time to accommodate to the wider-angle view in which sickness never occurs in a clinically pristine vacuum, but in every case is inextricable from personal beliefs, emotions and behavior. As Dr. Osler advised, "The good physician treats the disease; the great physician treats the patient who has the disease."

References

1 Verghese, Abraham: Bedside Manners. Texas Monthly, February 2007, pp. 70-80.

2 Brown, E. Richard: Rockefeller Medicine Men: Medicine & Capitalism in America. University of California Press, 1979, p. 136.

3 Ibid., p. 115.

4 Ibid., p. 116.

5 Ibid., p. 120.

6 Ibid., p. 120.

7 Ibid., p. 106.

8 Ibid., p. 162.

9 Bliss, Michael: William Osler: A Life in Medicine. Oxford University Press, 1999, p. 388.

10 Brown, p. 189.

11 Wall Street Journal, February 26, 1992.

12 Cell 2 Soul. 2006 Spring; 2(1):a5
