As early as 1842, an English social reformer named Edwin Chadwick called for replacing cesspools and privy vaults with a citywide sewage system consisting of pipes that would use household water to carry fecal matter and other waste to a single location where it could be composted and sold as fertilizer.33 Chadwick himself failed to understand the dangers of drinking contaminated water; instead, he believed that cholera, typhoid fever, dysentery, and other waterborne diseases were spread through “foul air,” so his goal was to move sources of contagion downwind of the cities.34
The first proof that cholera was spread by contaminated drinking water had to wait until 1854, when an English physician named John Snow found that nearly every victim of a cholera outbreak in London’s Soho district lived near, and drank water from, a single well. The well was later found to be only three feet from a cesspool that had been contaminated by cholera.35 Although Snow’s report led the city to remove the pump handle from that particular well, his theory of waterborne contamination remained controversial for many years.
Although most American waterworks were eventually taken over by city governments, they could have remained private, and their costs, even under government ownership, have largely been paid out of user fees. From a public health standpoint, however, voluntary user fees might be inadequate to pay for sewers; as long as anyone could refuse to pay the fee and continue to dump waste in a cesspool, water supplies could become contaminated. Chadwick’s hope that fertilizer sales would cover the costs of a citywide sewage system was unrealistic, and the high cost of sewers, combined with debates over the actual causes of disease, kept most American cities from installing universal sewage systems for several decades.
Chicago, for example, built one of the first American municipal sewer systems to handle human waste in 1856.36 Yet 37 years later, a survey found that nearly three out of four Chicago residents still relied on privies rather than indoor plumbing.37 The reason was that Chicago paid for the sewers out of user fees, so the only homes connected to the sewage system were those whose owners could afford indoor plumbing and hookup fees.38 Not until 1902 did Chicago mandate that all new homes be hooked up to sewers. That law exempted existing homes, and by increasing the cost of new housing, it made it harder for working-class families to buy a home.39
The cost of indoor plumbing fixtures and connections to city water and sewer systems could nearly double the price of a small, single-family home.40 Mandating such hookups would price many working-class families out of the homeownership market.
Boston took a different approach from Chicago’s. Though it did not begin building an integrated sewer system until after the Civil War, when it did so it paid the capital costs out of general funds, meaning mainly property taxes.41 Under this payment method, owners of expensive homes effectively subsidized owners of smaller ones. Low-income families who sought to build a home still had to pay for indoor plumbing fixtures and to pay the city for water meters and operating costs.
In the long run, the public sewer model had an even more profound effect on housing costs. The perceived need for publicly owned, centralized sewer systems led to a significant growth in city government. Cities hired sanitary engineers who attempted to forecast future needs and write long-range plans to meet them.42 These long-range plans set a precedent for later city plans and for increasingly specific regulations dealing with such things as transportation, land uses, parks, historic buildings, watersheds, and trees. Although no one can dispute the public health benefits of integrated sewage systems, later policies that, for example, imposed “impact fees” on homebuilders to pay for transportation or created time-consuming permitting processes for cutting individual trees significantly increased the cost of homeownership while providing dubious benefits.
Housing for Factory Workers
In the short run, another late 19th-century trend had an even larger effect on housing affordability: the growth of the factory system. The nation’s first factory, the Slater Mill in Pawtucket, Rhode Island, employed just nine workers. As late as 1880, factories remained small enough that close to 90 percent of the families of, say, Detroit could live in individual homes while workers stayed within walking distance of their places of employment.43
However, the average number of employees per factory more than doubled between 1869 and 1899 and continued to grow after that.44 By 1904, 60 percent of all manufacturing employees worked in factories with more than 100 employees, and 12 percent worked in factories with more than 1,000 employees.45 Moreover, the tendency of many industries, such as Chicago’s stockyards, to concentrate in one part of a city created transportation problems for workers.
Although some cities saw the installation of horsecars as early as 1850 and rapid growth of electric streetcar networks after 1890, factory workers earning between $3 and $6 a week could not afford to devote 10 to 20 percent of their incomes to transit fares. The limited amount of land within walking distance of factories, and the resulting high cost of such land, forced many to live in high-density tenements instead of single-family homes.
The word “tenements” brings to mind the extremely high-density mid-rise buildings of New York City’s Lower East Side, which housed several families per apartment, or even per room. Reformer Jacob Riis photographed residents of these buildings in the late 1880s and early 1890s. His 1890 book How the Other Half Lives: Studies among the Tenements of New York included several of these photos, drew public attention to poor housing conditions, and influenced the passage of several tenement laws.46
Riis’s tenements were five to six stories high, built on 25-by-100-foot lots, meaning about 16 could fit on a single acre. The front and rear of each building occupied the full width of the lot, with a small airshaft between adjacent buildings at the center. Because they were narrow in the middle to make room for the airshaft, these buildings were often called “dumbbells.”47 Designed to house four families to a floor, or up to 24 per building, they were sometimes packed with far more. The narrow airshafts meant that most rooms had little light, and the odors from the garbage that people inevitably threw to the inaccessible bottoms of the shafts must have been stifling. Many of these tenements had no indoor plumbing; those that did might have only one toilet per floor. Perhaps most scandalous to the middle-class readers of Riis’s book was the lack of privacy: children of all ages and both sexes often slept in the same rooms as their parents, other relatives, and unrelated boarders.
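As a rough arithmetic check (a back-of-the-envelope sketch, not a calculation given in the original source): an acre contains 43,560 square feet and each 25-by-100-foot lot covers 2,500 square feet, so

$$\frac{43{,}560\ \text{ft}^2}{2{,}500\ \text{ft}^2\ \text{per lot}} \approx 17\ \text{lots per acre},$$

which is consistent with “about 16” once streets and alleys absorb part of each acre.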
New York City tenement conditions were a direct function of the density of inner-city jobs. The Triangle Shirtwaist factory, site of the infamous fire that killed nearly 150 workers, occupied three floors of a 10-story building in lower Manhattan and employed 500 workers. The building covered about one-quarter acre out of the nearly 7,000 developed acres in Manhattan. Although pre-1890 factories would have had fewer floors, even six-story factories could contain 4,000 workers per acre, most of whom had to live within walking distance of the buildings.
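The 4,000-per-acre figure can be reconstructed from the Triangle numbers themselves (again, a back-of-the-envelope sketch rather than a calculation from the original source):

$$\frac{500\ \text{workers}}{3\ \text{floors}} \approx 167\ \text{per floor}, \qquad \frac{167}{0.25\ \text{acre}} \approx 670\ \text{per acre of ground per floor}, \qquad 6 \times 670 \approx 4{,}000\ \text{workers per acre}.$$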
In 1910, the year before the Triangle Shirtwaist fire, Manhattan had about 2.3 million residents at an average population density of about 100,000 people per square mile. (For comparison, the median density of the nation’s 50 largest urban areas in 2000 was about 2,800 people per square mile.) Manhattan today has about 2.3 million jobs, and before 1900 nearly all residents who held jobs almost certainly worked on the island. The combination of offices, factories, and residences competing for space made lower Manhattan some of the most valuable real estate in the world. Factory workers who could not afford to commute off the island were forced to live at high densities, which is why the Lower East Side housed some 334,000 people per square mile in the 1880s.
The problem was compounded by 19th-century construction technology, which made it difficult to build structures taller than about six stories. (The nation’s first steel-framed skyscraper, St. Louis’s 10-story Wainwright Building, was built in 1891; the tallest commercial masonry building ever constructed, Chicago’s 16-story Monadnock Building, was also completed in 1891.) Residences typically require far more space per occupant than factories provide per worker; given Manhattan’s high land prices, the tenement was