Because the medical establishment was quick to label alternatives to the Western way as hocus-pocus and quackery, chiropractors, homeopaths, and practitioners from other schools of thought (often women) were pushed out of the mainstream medical arena. The Pure Food and Drug Act, regulating the prescription of medicinals, was passed in 1906. With the release of the Flexner Report, Medical Education in the United States and Canada, in 1910, competing forms of medicine were virtually obliterated. Abraham Flexner, a U.S. educator and founder of the Institute for Advanced Study in Princeton, New Jersey, developed the report to set standards for American medical school education. In some instances the report helped regulate education, which in the 1800s had been far from adequate, ensuring that doctors were qualified to care for the ill and infirm.
But the Flexner Report certainly had its shortcomings: it was rigid, leaving little room for innovation and flexibility in medical education, and it never addressed the patient-doctor relationship. At the time the report was written, this relationship was taken for granted. Of course the patient would always be considered above all else, Flexner reasoned, so he did not write that concept into the standards. Generations of doctors learned how to treat human bodies, and that is all they learned. The importance of listening and staying connected to the patient was lost somewhere between anatomy and pharmacy. And even Flexner, who was not a physician, was eventually displeased with the rigid standards he had originally helped create.
Within a few short years of Flexner's curriculum guide, alternative medical schools—schools of homeopathy and osteopathy, to name two—were left with little more than fringe status, and most were forced to close their doors. (Homeopathy is a school of treatment involving the administration of minute doses of remedies that heighten the symptoms a patient is experiencing, in an effort to spur the body's own powers to restore harmony. Osteopathy is the science of manipulating the musculoskeletal system to restore health.) Biomedicine was the standard. Doctors and the powerful lobby of the AMA successfully kept the "charlatans," as they called them, out of the arena for nearly sixty years. It wasn't until consumer confidence in conventional medicine started to wane three decades ago that an opening appeared for alternative paths of healing. Many of the changes began to emerge in the sixties. Reports on the serious side effects of commonly used drugs like antibiotics started chipping away at consumer confidence. Add a resurgence of more virulent strains of tuberculosis and deadly bacterial species like strep-A (the flesh-eater), not to mention diseases like AIDS, Alzheimer's, and cancer that traditional treatments could not cure, and medical shoppers began looking for more. They wanted something to help them feel better: not necessarily to get rid of the disease, just to make them feel better. They wanted a more satisfying way of life.
MEDICINE AMERICAN-STYLE
Western Miracles and the Deification of Doctors
Many of this country's medical breakthroughs have had some basis in ancient Eastern wisdom. Drawing on an eleventh-century Chinese practice of using a powder derived from aging smallpox scabs to prevent disease, English country doctor Edward Jenner carried this Asian discovery further and developed a vaccine for smallpox. In 1796 Jenner scratched eight-year-old James Phipps's arm with the cowpox virus. It was this simple experiment that, several generations later, led to the eradication of smallpox in America and most of the world.7 Although the Chinese were the first to use inoculation against smallpox, it was Jenner who was named the Father of Vaccinology.
In the first half of this century, new medical discoveries dramatically altered the face of Western medicine. Soaring past ancient horizons, medicine's innovations unveiled frontiers never before explored by even the most adventurous of healers. British bacteriologist Alexander Fleming was one such pathfinder. Returning from vacation in 1928, the pioneering scientist was cleaning up his laboratory and discarding used culture plates when he observed something new: a fungus that had flourished on the plates in his absence was destroying the fringes of the deadly staphylococcus bacteria smeared on them. His observations, although not fully appreciated and developed into penicillin until the 1940s, gave rise to a new era of treatment.8
At the same time, government, academia, medical science, and the private sector (namely, drug companies with big dollars) formed previously unheard-of alliances. Vast sums of government money were poured into medical research at medical schools and universities, and this powerful partnership9 began a miraculous wave of invention that launched Western medical care into an age of wondrous findings and technological advances. Smallpox and polio were virtually eradicated in the Western Hemisphere; human eggs could be fertilized in test tubes instead of in the mother's womb; surgery and medical imaging, enhanced by computers and robotics, became commonplace; body organs could be transplanted from dead patients into living ones, and even from pigs to humans; and through innovations in communication and remote-surgery techniques, surgeons and patients could remain on opposite sides of the country during surgical procedures.
As we approach the end of this millennium, technology is advancing more rapidly than even such sci-fi legends as Robert Heinlein or Isaac Asimov could have predicted. We're entering the twenty-first century with people living well into their eighties and nineties; forty- and fifty-year-old women (ages that once marked the end of life) are giving birth; aging, withered bodies are being sustained by transplants, respirators, and feeding tubes, while women and men voluntarily hook themselves up to the Kevorkian death machine when they can no longer tolerate the pain of living with debilitating diseases.
It's no wonder that we've been misguided into thinking that our doctors, our external healers, are deities capable of performing the greatest of miracles. We don't just pray for such miracles; we expect them. When a doctor fails to meet our ever-increasing demands for youth and immortality, we sue. We scream malpractice. Doctors work in fear of litigation, often feeling it necessary to order a battery of unnecessary and expensive tests.10
An Era of Alienation
For the first time in history, we're a generation for the most part sadly lacking the wisdom of those who went before us. We've abdicated responsibility for our health and turned our health problems over to teams of specialists. And these doctors, not being generalists, are not in a position to see us as whole human beings. The result is alienation between doctor and patient.
We seem to have lost touch with common sense, that inner knowing that tells us a baby's cold needs to run its course, or that a twisted ankle should be elevated and rested; the doctor would say the same thing. There was a time when cultures relied on this most valuable resource, common sense.
We have lost sight of the fact that healing messages are as individual as the beings searching for a cure. Because of medical progress, it has become easier to view the human body as a machine: take a number, line up, cut it out, cut it off; get the broken or diseased part fixed; forget about it. We have become faceless body parts in the medical maze. Doctors admit that it's easy to forget there is a person attached to the gallbladder, the lung, the breast.
One surgeon shared a story with me. He had recently performed a routine gallbladder surgery on an older woman. When she returned to his office for the postoperative exam, she stood clutching her fine leather bag as he breezed in to check her progress. He started talking, asking her questions. He didn't recognize her, but that was pretty common since he had gotten so busy; he didn't recognize lots of patients. But the woman