Since the early twentieth century, a tension has existed within the public health field—which mirrors a societal one—between, on the one hand, those who set their sights on prevention of disease and conditions dangerous to health through society-wide efforts and, on the other, those who believe in the more modest and pragmatic goal of ameliorating conditions through piecemeal reforms, personal education, and individual treatments. Despite the tremendous successes of environmental and political efforts to stem epidemics and lower mortality from infectious diseases, the credit for these improvements went to physicians (and the potent drugs they sometimes had at hand), whose role was to treat individuals. This shift also coincidentally, or not so coincidentally, undermined a public health logic that was potentially disruptive to existing social and power relationships between landlord and tenant, worker and industrialist, and poor immigrants and political leaders.
At elite universities around the country—from Harvard, Yale, and Columbia to Johns Hopkins and Tulane—new schools of public health were established in the first two decades of the twentieth century with funds from the Rockefeller and Carnegie Foundations. Educators at these new schools had faith that science and technology could ameliorate the public health threats that fed broader social conflicts. They envisioned a politically neutral technological and scientific field removed from the politics of reform. The Johns Hopkins School of Hygiene and Public Health was at the center of this movement. William Welch, the school’s founder and first director (as well as the first dean of the university’s medical school), argued persuasively that bacteriology and the laboratory sciences held the key to the future of the field.6 By the mid-twentieth century, municipal public health officials in most cities had adopted this approach. If early in the century public health workers in alliance with social reformers succeeded in getting legislation passed to control child labor and the dangers to health that accompanied it, and to protect women from having to work with such dangerous chemicals as phosphorus and lead, by midcentury departments of health worked more often to reduce exposures of workers to “acceptable” levels that would limit damage rather than eliminate it. Similarly, by the 1970s departments of health had established clinics aimed at treating the person with tuberculosis but displayed little interest in joining with reformers to tear down slums and build better houses for at-risk low-income people.7
By the 1950s and 1960s, when childhood lead poisoning emerged as a major national issue, public health practitioners were divided between those who defined their roles as identifying victims and treating symptoms and those who in addition sought alliances with social activists to prevent poisoning through housing reforms that would require lead removal. Drawing on the social movements of the 1960s, health professionals joined with antipoverty groups, civil rights organizations, environmentalists, and antiwar activists to struggle for access to health facilities for African Americans in the South and in underserved urban areas, for Chicanos active in the United Farm Workers’ strikes in the grape-growing areas of California and the West, for Native Americans on reservations throughout the country, and for soldiers returning from Vietnam suffering from post-traumatic stress disorders, among others. By the end of the twentieth century, though, the effort to eliminate childhood lead poisoning through improving urban infrastructure had largely been abandoned in favor of reducing exposures.
CHILDHOOD LEAD POISONING: PUBLIC HEALTH TRIUMPH OR TRAGEDY?
The campaign to halt childhood lead poisoning is often told as one of the great public health victories, like the efforts to eliminate diphtheria, polio, and other childhood scourges. After all, with the removal of lead from gasoline, blood lead levels of American children between the ages of one and five years declined precipitously from 15 micrograms per deciliter (μg/dl) in 1976–80 to 2.7 μg/dl by 1991–94,8 and levels have continued to drop. Today, the median blood lead level among children aged one to five years is 1.4 μg/dl, and 95 percent of children in this age group have levels below 4.1 μg/dl. Viewed from a broader perspective, however, the story is more complicated, and disturbing, and may constitute what Bruce Lanphear, a leading lead researcher, calls “a pyrrhic victory.”9 If 95 percent of American children have blood lead levels below what is today considered the danger threshold, then the remaining 5 percent—a half million children—still carry dangerous amounts of lead in their bodies. A century of knowledge about the harmful effects of lead in the environment and the success of efforts to eliminate some of its sources have not staunched the flood of this toxic material that is polluting our children, our local environments, and our planet.
FIGURE 1. Rates of lead poisoning, 2003. These rates are based on the CDC’s 2003 level of concern (10 µg/dl). In 2012, the CDC lowered that threshold to 5 µg/dl, increasing the number of at-risk children from approximately 250,000 to nearly half a million. Source: Environmental Health Watch, available at www.gcbl.org/system/files/images/lead_rates_national.jpg.
Today, despite broad understanding of the toxicity of this material, the world mines more lead and uses it in a wider variety of products than ever before. Our handheld electronic devices, the sheathing in our computers, and the batteries in our motor vehicles, even in new “green” cars such as the Prius, depend on it. While in the United States the new uses of lead are to a certain degree circumscribed, the disposal of all our electronic devices and the production of lead-bearing materials through mining, smelting, and manufacture in many countries continue to poison communities around the world. Industrial societies in the West may have significantly reduced the levels of new lead contamination, but the horror of lead poisoning here is hardly behind us: exposure continues to come from lead paint in hundreds of thousands of homes, from airborne particles released by smelters and other sources, from contaminated soil, from lead solder and pipes in city water systems, and from some imported toys and trinkets. Over time, millions of children have been poisoned.
In the past, untold numbers of children suffered noticeably from irritability, loss of appetite, awkward gait, abdominal pain, and vomiting; many went into convulsions and comas, often leading to death. The level of exposure that results in such symptoms still occurs in some places. But today new concerns have arisen as researchers have found repeatedly that what a decade earlier was thought to be a “safe” level of lead in children’s bodies turned out itself to cause life-altering neurological and physiological damage. Even by the federal standard in place at the beginning of 2012 (10 μg/dl), more than a quarter of a million American children were victims of lead poisoning, a condition that almost a century ago was already recognized, with some accuracy, as entirely preventable. Later in 2012, the Centers for Disease Control and Prevention (CDC) lowered the level of concern to 5 μg/dl, nearly doubling estimates of the number of possible victims.
The ongoing tragedy of lead poisoning rarely provokes the outrage one might expect, however. If this were meningitis, or even an outbreak of measles, lead poisoning would be the focus of concerted national action. In the 1950s, fewer than sixty thousand new cases of polio per year created a near-panic among American parents and a national mobilization of vaccination campaigns that virtually wiped out the disease within a decade. At no point in the past hundred years has there been a similar national mobilization over lead, despite its ubiquity and the havoc it can wreak.
For much of the twentieth century there were no systematic records telling us the number of children whose lives had been destroyed by lead. What we have known, as one researcher put it in the 1920s, is that “a child lives in a lead world.”10 By the 1920s, virtually every item a toddler touched had some amount of lead in or on it. Leaded toy soldiers and dolls, painted toys, beanbags, baseballs, fishing lures, and other equipment that were part of the new consumer economy of the time; the porcelain, pipes, and joints in the sparkling new kitchens and bathrooms of the expanding housing stock—all were made of or contained large amounts of lead. But more