The American Experience with Complex Decisions: Past Examples
Throughout its history, the United States has acted decisively to confront major challenges, even in the absence of complete information or national consensus. National action has been initiated by a variety of individuals and institutions, including citizens, the private sector, and government. Sometimes new scientific knowledge was instrumental in prompting action, but in other cases political, corporate, or moral leadership responded decisively despite uncertain or incomplete scientific knowledge, potential costs, or conflicting public opinion.
This appendix describes five historical examples in which the United States successfully confronted and overcame major national and international challenges. There are two reasons to pay attention to such case studies. First, analogies often play an important role in human decision making (e.g., Gentner et al., 2001; Vosniadou and Ortony, 1989). Second, historians and political scientists have identified a number of examples in which key leaders drew upon historical analogies to make decisions about national and foreign policy (Hacker, 2001; Houghton, 1996; Neustadt and May, 1986). Likewise, scientists, journalists, environmentalists, and labor, religious, political, and business leaders have often drawn upon historical analogies to help articulate and explain the climate change problem and its potential solutions (Sabin, 2010). Reasoning by historical analogy can be both useful and challenging: useful because analogies can at times help to identify and illuminate important problem features and potential solutions, and challenging because at other times analogies can misdirect attention and lead to the misapplication of the “lessons of history.” While climate change is a relatively new, highly complex, and in many ways unique problem, it nonetheless shares a number of features with prior national and international challenges.
Historical analogies also remind us that the United States has successfully overcome complex and difficult problems in the past. Each example below shares important similarities with the challenge of climate change; however, each also differs in important ways. Many other historical analogues have been used to think about climate change (e.g., the Manhattan Project, the Apollo Space Program, the New Deal), but they are not included here.
THE MONTREAL PROTOCOL
The Montreal Protocol on Substances that Deplete the Ozone Layer is often recognized as a potential model for climate change. Like climate change, ozone depletion is a global environmental threat. In this case, emissions of human-produced chlorofluorocarbons (CFCs) are destroying the ozone layer that protects the Earth’s surface from harmful solar ultraviolet (UV) light. Like the greenhouse gas (GHG) emissions that cause climate change, these emissions come from a variety of industrial processes taking place in both developed and developing countries, with the bulk of such emissions originating historically from the industrialized world. Also as in the case of climate change, scientific research discovered an unintended consequence of modern industrial activities that is largely invisible to the eye yet has potentially very serious global consequences. Furthermore, the early years of ozone layer research were filled with scientific uncertainties. For example, in 1974, based on laboratory research, chemists Mario Molina and Sherwood Rowland first hypothesized that CFCs were stable enough to rise to the stratosphere, where solar radiation would break them apart, releasing chlorine that destroys the Earth’s protective ozone layer (Molina and Rowland, 1974). Their research was roundly criticized by a number of companies that produced or relied upon CFCs. Nonetheless, the news media reported their hypothesis and identified common household products (such as aerosol spray cans) as one of the sources of CFCs. The public quickly responded, with many choosing to avoid CFC-based products. It was not until 1985 that British Antarctic Survey scientists finally discovered the formation of an ozone “hole” in the stratosphere over Antarctica (Farman et al., 1985). That same year, the Vienna Convention for the Protection of the Ozone Layer was negotiated and signed by many of the world’s largest emitters; this was quickly followed in 1987 by the Montreal Protocol, which entered into force in 1989.
Technological innovation and market position played critical roles in the policy-making process, because the same companies that had produced CFCs invented more benign substitutes. The structure of the Vienna Convention was also important, as it included a periodic review of the evolving science, a structure by which the treaty could be revised and updated over time, and a special fund to assist developing countries in complying with the treaty. Over the years, as the science has progressed, the treaty has been progressively tightened to achieve a further and faster phase-out of ozone-destroying compounds. As a result, the Montreal Protocol has been hailed internationally as one of the most successful international agreements ever (DeSombre, 2000).
While there are some similarities between the problems of climate change and ozone depletion, there are also some very important differences (Bodansky, 2001; Grundig, 2006). For example, the substances responsible for ozone depletion (CFCs) were produced by a relatively small number of companies; substitutes were developed by these same companies (which stood to profit from the transition); and while CFCs were important to certain products and industrial processes, they were not fundamental to the operation of modern society. Climate change, by contrast, is driven primarily by the burning of fossil fuels, which currently have few comparable alternatives; they are produced by some of the world’s largest companies; they provide the primary source of income for a number of key nations; and they supply the primary source of energy for the world. Furthermore, ozone depletion threatened significant personal health consequences because UV-B light is associated with increased rates of skin cancer. While climate change is also projected to have significant health consequences, those impacts will be neither universal nor as personally relevant to most Americans. People around the world could see themselves as more or less vulnerable to the risk of increased UV light due to ozone depletion, while the health risks of climate change are likely to be much more heterogeneous. In fact, studies have found that a majority of Americans currently believe climate change will have little or no impact on human health, or they simply do not know (Leiserowitz et al., 2009). Finally, while CFCs were used in some consumer products such as aerosol spray cans and refrigerators, fossil fuels power much of the world’s transportation system and electrical grid and provide key inputs into countless goods and foodstuffs (Sunstein, 2007).
THE ERADICATION OF SMALLPOX
Limiting the severity and adapting to the impacts of climate change will require the participation of individuals, organizations, and governments in every nation. Daunting as the scope of this task is, it too has precedent. In 1979, the United Nations World Health Organization (WHO) formally declared victory in its 20-year campaign to eradicate smallpox worldwide.
Smallpox was one of the most deadly and contagious diseases known to humankind. It originated about 10,000 years ago and became endemic across Europe and Asia. Before widespread vaccinations became available during the 19th century, the disease killed about half a million people annually (0.5 percent of the population) in Europe alone. By the 20th century, smallpox still killed about 2 million people each year worldwide. In 1959 the United Nations began—and in 1967 greatly intensified—a campaign to eradicate the disease worldwide, a task made possible because smallpox exists only in humans and has no other carriers. Using extensive networks to reach every village on Earth, particularly in Africa and the Indian subcontinent, WHO teams identified each outbreak, isolated the victims, and vaccinated the surrounding population. Advertising campaigns and financial incentives encouraged even illiterate villagers
to quickly report any smallpox outbreaks. Near-universal participation was necessary because any unreported case could allow the disease to persist and spread. After years of hard and coordinated work costing hundreds of millions of dollars, the campaign achieved final success. The last naturally occurring case of smallpox was diagnosed and contained in Somalia in 1977 (Fenner, 1993; Oldstone, 1998).
Despite similarities in scope, however, the eradication of smallpox differs in important ways from efforts to reduce climate change. For example, the disease’s impacts were immediate and personal—the disease horribly disfigured and often killed its victims. Thus, individuals, communities, and entire nations could readily see and personalize the threat. Compared to climate change, the required responses—quarantine and vaccination—were relatively quick, were inexpensive, and did not fundamentally challenge existing social and economic patterns. Nonetheless, there are parallels to some of the risks associated with climate change, including increases in vector-borne or diarrheal infections that often afflict the poor; effective responses can reduce overall vulnerability to these impacts of climate change just as they did to smallpox. The eradication of smallpox also stands as an example of how even adversaries, such as the United States and the Soviet Union, could work together through the United Nations to eliminate a common threat to humanity.
THE CLEAN AIR ACT
Over the past 50 years, environmental protection has proven one of the great public policy success stories in the United States. In particular, the 1970 Clean Air Act and its major 1990 amendments have dramatically reduced unsightly and unhealthy air pollution at a cost representing a tiny fraction of the benefits produced.
In the 1950s and 1960s, the air above many American towns and cities had become deadly with increasing industrialization and automobile use. In 1966, an air pollution inversion killed 168 people in New York City. At times, ozone levels in Los Angeles’ air rose to five times above safe levels and visibility dropped to mere blocks. Noxious smog engulfed steel towns in the industrial heartland. During these two decades, the federal government established research programs to develop air pollution monitoring and abatement technology. California and other states began to regulate their emissions. In 1970, Congress passed the landmark Clean Air Act, authorizing the federal government and the states to regulate industrial and automotive emissions to meet national air quality standards (Krier, 1977). In 1990, Congress significantly enhanced the original act, in particular establishing a cap-and-trade system for sulfur
dioxide (a principal cause of acid rain), a forerunner of the system some have proposed for limiting GHGs.
The 1970 clean air legislation occurred as one element in a social transformation—the rise of environmental consciousness in the United States. The first Earth Day was held that year, and the Environmental Protection Agency was established in December 1970. The Clean Air Act was a dramatic success. Since 1970, the economy has more than doubled in size, yet sulfur dioxide emissions have dropped by a third, lead has dropped by 98 percent, and the air across the nation is visibly cleaner and meets health standards far more often. Over its first 20 years, the Clean Air Act is estimated to have cost the United States about $500 billion while saving over $20 trillion, a benefit-cost ratio of over 40 to 1 (EPA, 1997).
Cleaning America’s air, however, also represents a different challenge than climate change. The Clean Air Act was largely a national project, achievable without the cooperation of other nations (although many other nations were undertaking parallel efforts). The impacts of dirty air were also far more visible and immediate to citizens than the impacts of climate change. In addition, the job of cleaning America’s air is still not complete. Many cities still consistently violate health standards, and, with a growing economy and traffic, continued improvements in air quality may require new technology transitions. Nonetheless, the Clean Air Act provides an example of a hotly contested environmental policy that transformed technology across many sectors of the U.S. economy, significantly reduced pollution at a fraction of the cost initially estimated by many, and has made a dramatic, observable difference in people’s lives.
THE TRANSCONTINENTAL RAILROAD
To address climate change, the government must help catalyze a technology and infrastructure revolution that will transform the way Americans produce and consume energy. This would not be the first time the U.S. government has facilitated such a transformation. The first transcontinental railroad, completed in 1869, is widely considered one of the great engineering triumphs of the 19th century. Over the following decades, the massive project successfully achieved the main goals of the policy makers who helped to finance it—linking the far-flung pieces of a country recently shattered by civil war and enabling the world’s first, and still strongest, continental economy (Ambrose, 2000; Bain, 1999; Goodrich, 1960).
The trip west to California by ship or wagon had previously taken months. The railroad reduced it to days. Visionaries had dreamed of a Pacific railroad since the 1830s, but the project’s risk and expense put it outside the reach of any private entrepreneurs.
Sectional disagreements over a northern or southern route prevented government action until, in the midst of the Civil War, Congress approved the Pacific Railroad Act, which incentivized private firms with large subsidies, paying them in government bonds and land grants for each mile of track laid. The government dictated the basic route (roughly following what is now Interstate 80) but left the details to the railroads. The legislation launched a process rife with determination, thievery, heroism, cruelty, and corruption as multiple lines raced their way east and west, eventually meeting at Promontory Summit, Utah.
Addressing climate change will also require widespread deployment of new technology and infrastructure. Just as in the building of the transcontinental railroad, the government will need to chart a broad plan, provide incentives, and take some of the risk, while leaving the private sector to make most of the specific engineering and investment decisions. Building the transcontinental railroad, however, is only a partially useful analogue to climate change. The United States accomplished the endeavor alone, without need for cooperation with other nations. The project disregarded environmental concerns, the rights of indigenous peoples, the welfare of immigrant workers, and proper oversight of public funds in ways that would rightly be intolerable today. The benefits (and dangers) of the new railroad were immediate and largely obvious to all concerned. Yet the Pacific Railroad does stand as a towering example of policy makers successfully pursuing a long-term, transformational goal without a detailed long-term plan. Instead, the federal government provided strong financial incentives to the private sector that catalyzed the widespread deployment and use of a new technology that transformed the world.
WORLD WAR II
The massive national mobilization required to fight and win World War II may have some useful lessons for the prevention of catastrophic climate change (Bartels, 2001; Brown, 2009). As in WWII, reducing global emissions of GHGs will require a national focus, a sense of urgency, dedication to success, cooperation with other countries, and national mobilization involving individuals, all levels of government, diverse economic sectors, and broad civil society. In response to the threat of Nazi Germany and Imperial Japan, and after the tragedy of the attack on Pearl Harbor, the United States literally reinvented and reorganized itself. Within months, many factories had been retooled from commercial to military purposes. During the war, millions of American men and women were drafted or volunteered for military service, while millions more worked on the home front in support of the war effort, including in factories, on farms, and in construction (Gropman, 1996; Koistinen, 2004). A wide variety of commodities
were rationed, including cars, fuel, food, and clothing, and many Americans planted “Victory Gardens” to feed their families during the war. Moreover, the outcome of the war itself was deeply uncertain, and, as it proceeded, Americans endured and surmounted a number of major military setbacks and losses. Nonetheless, the country and its leaders were willing to act despite these enormous uncertainties and large costs in both human lives and national treasure. Moreover, the United States partnered with the other Allies, including ideological foes like the Soviet Union, to defeat their common enemy. Winning WWII thus required an extraordinary level of coordination both within the United States and internationally. And in the process, the United States reinvented itself, emerging from the war as a global military, economic, and cultural superpower.
Preventing dangerous levels of climate change will also require changes in the way American society produces and consumes energy and significant changes across economic and political sectors, both within the United States and internationally. While WWII reminds us what the United States can achieve when it is motivated, it is also important to recognize that climate change presents a different set of challenges. In WWII, the United States faced an existential threat from other human beings—namely the Axis powers, led by Hitler, Mussolini, and Hirohito—an enemy that was easy to understand and vilify, and against which the nation could be mobilized. By contrast, climate change does not have an easily identifiable villain. In the words of the cartoon character Pogo, “we have met the enemy and he is us.” Most human activities in the modern world result in the release of GHG emissions. While fingers of blame are often pointed at particular leaders, industries, and entire nations, the truth is that almost all human beings are complicit, albeit to widely differing degrees, in the problem. Risk-perception researchers have also found that human beings are generally more sensitive to and concerned about threats from other human beings or human technologies than from natural hazards, which are often viewed more fatalistically as uncontrollable acts of nature or God (Slovic, 2000). Unlike the bombing of Pearl Harbor or the terrorist attacks of September 11, 2001, climate change will manifest primarily as more frequent or severe natural hazards (e.g., heat waves, droughts, floods, disease outbreaks)—harm by a thousand (seemingly natural) cuts rather than a single catastrophic event.
Furthermore, while fascism was easily understood as a direct threat to the nation’s security (and one’s own liberty), climate change is currently perceived by many as a threat to unseen others (future generations, people, and species far away), although it is increasingly raised as a new threat to national security (Fingar, 2009; Leiserowitz et al., 2009). Finally, Americans’ response to WWII was deeply rooted in the values of self-defense, patriotism, and national pride. The fight against climate change, however, has not yet tapped into these core values. Nonetheless, WWII stands as a powerful reminder that
when the United States is sufficiently motivated and mobilized, it can literally reinvent and transform itself and the world with speed and innovation.
Climate change presents a technical, social, and political challenge that is in some ways similar to—although in other ways quite distinct from—many challenges the United States has faced before. The United States has the proven ability to revolutionize technology and the nation’s infrastructure, mobilize around a common purpose, work with other nations to combat common threats, and solve major environmental problems at far less cost than originally expected. Previous generations have successfully addressed problems of similarly daunting complexity, uncertainty, and scale.