to nearby waterways (Tarr, 1979). Separate sanitary sewers (that conveyed mainly waste from homes and businesses) were built in several dozen cities because they were less expensive and the concentrated wastes could be used as fertilizers (Tarr, 1979). By 1890, approximately 70 percent of the urban population lived in areas that were served by one of the two types of sewer systems (Figure 2-1).
Throughout this period, the wastes conveyed by combined sewer systems were usually discharged to surface waters without any treatment because the available treatment methods (e.g., chemical precipitation) were considered to be too expensive (Billings, 1885). As a result of the rapid growth of cities and the relatively large volumes of water discharged by sewers, drinking water supplies of cities employing sewers and their downstream neighbors were compromised by waterborne pathogens, resulting in increased mortality due to waterborne diseases (Tarr et al., 1984). For example, severe outbreaks of typhoid fever in Lowell and Lawrence, Massachusetts, in 1890 and 1891, in which over 200 people died, were traced back to the discharge of sewage by communities located approximately 12 miles (20 km) upstream of Lawrence (Sedgwick, 1914).
In cities with separate sanitary sewers, treatment was more common because of the smaller volumes and consistent quality of the waste. In some communities, sewage was applied directly to orchards or farms in a practice known as sewage farming (Anonymous, 1893; see Box 2-1). Sewage farming led to high crop yields, especially in locations where water was limited. The nutrients in the sewage made sewage farming attractive to farmers, but the practice eventually died out in the 1920s as public health officials expressed concerns about exposure to pathogens on fruits and vegetables grown on sewage farms.
FIGURE 2-1 Comparison of total U.S. population with urban population, population served by sewers, population served by water treatment plants, and population served by wastewater treatment plants.
SOURCES: Tarr et al. (1984); EPA (2008b).
As downstream communities became aware of the impact that upstream communities were having on their water supplies, debates arose about the obligation of communities to remove contaminants from sewage prior to discharge. Leading engineers, such as Allen Hazen, advocated that downstream cities install drinking water treatment systems (Hazen, 1909), while public health scientists, like William Sedgwick (1914), advocated requiring cities to treat sewage. Many sanitary engineers based their assertion that wastewater treatment was unnecessary on the belief that flowing water undergoes a process of self-purification: as long as a water supply was located a sufficient distance downstream of a sewage discharge, the water would be safe to drink. In fact, this concept was instrumental in the state of Massachusetts’ policy of allowing sewage discharges to rivers if the outfall was located more than 20 miles (32 km) from a drinking water intake (Hazen, 1909; Sedgwick, 1914; Tarr, 1979). As a result of these debates, downstream communities often took responsibility for ensuring the safety of their own water supplies by building drinking water treatment plants or relocating their water supplies to protected watersheds.
Emergence of Wastewater Treatment
In 1900, less than 5 percent of the municipal wastewater in the United States was treated in any way prior to discharge (Figure 2-1). However, increases in population density, especially in cities, coupled with the growth of the progressive movement, which created a greater awareness of natural resources, led to increased construction of wastewater treatment systems (Burian et al., 2000). Coincident with these trends was the development of more cost-effective methods of biological wastewater treatment, such as activated