This appendix contains the scenarios referred to in the Preface and Executive Summary. The participants listed in Appendix B developed them at a fall 2002 workshop.
About Scenario Thinking
Scenario planning is a highly interactive process that is intense and imaginative. The idea of scenarios is to tell possible stories about the future. A scenario is a tool for ordering one’s perceptions about alternative future environments in which today’s decisions might be played out.
The initial phase usually involves rigorously challenging the mental maps that shape one’s perceptions. A good scenario planning project expands our peripheral vision and forces us to examine our assumptions and to practice what we would do if the unthinkable happened—a condition that occurs more often than one might imagine.
In the body of the process, groups identify driving forces (social, economic, political, and technological) and the factors that shape those forces. These factors are then prioritized according to importance and uncertainty.
Each scenario should represent a plausible alternative future, not a best-case, worst-case, or most-likely continuum. Most important, the
test of a good scenario is not whether it portrays the future accurately but whether it provides a mechanism for learning and adapting.
THE NEXT SCIENTIFIC REVOLUTION
In the past few years we have discovered that the universe is expanding at an accelerating rate and is probably flat rather than curved. We have found evidence of neutrinos with mass. We have completed the first draft of the human genome and started building the fastest computer to solve even more complex problems in biology.

In physics, profound changes in our understanding are unfolding at the very large scale of the universe and at the very small scale inside subatomic particles. One of the most challenging problems in physics today is the apparent incompatibility between general relativity, which describes the nature of the large-scale universe very well, and quantum mechanics, which is useful at the very small scale.

Like physics, chemistry is benefiting from new computational methods and the ability to manipulate matter at the very small scale. We can simulate chemical reactions and structures that enable us to try many more alternatives in virtual labs than we could ever do in the real world. Research proceeds faster and better. We have new tools to see and manipulate individual molecules.

In biology the revolutionary dynamics are similar, with the added dimension of the decoding of the human genome. Advances over the past few decades in genetics and molecular biology have enormously expanded our understanding and control of biological systems. Once biology was mainly an empirical science. We could only observe what is. Unlike physics, we could not reliably predict and control the behavior of systems. Increasingly, biology is becoming a quantitative science like physics, with higher levels of ability to predict and control.
We imagine that the new ideas and discoveries in physics, biology, chemistry, and mathematics are leading to a revolutionary “moment” where we will reconceptualize and reperceive reality. We have been here before. One of the consequences of the revolution in physics at the beginning of the 20th century was that reality got weird. In the 19th century, physics made the world more comprehensible. It was assumed that the real world was like a vast clock mechanism. If we identified all its pieces and figured out how they worked, we would understand the
nature of things. It was not hard to visualize the causal cascade of events that explained why things are as they are. Then along came relativity and quantum theory. Suddenly space and time became malleable and fluid rather than the fixed framework within which everything else happened. Heisenberg showed us that, thanks to the dynamics of very small things, it is impossible to know at once just where a particle is and how fast, and in what direction, it is moving. Uncertainty is in the nature of things. We now have great difficulty in imagining the workings of physical reality. After all, can anyone really imagine the big bang … a unique discontinuity in the fabric of space-time that suddenly blew up in an explosion of vast energy to create our universe?
In the West, one of the earliest models of the universe was that of Ptolemy. It worked fairly well, except for the fact that it assumed the earth was at the center of the universe and everything rotated around us. As astronomy got more sophisticated, we had to invent ever more elaborate mathematical models to make Ptolemy’s picture of reality work. The astral cycles of heavenly movement became cycles within cycles within cycles, until finally Copernicus observed that it would all be much simpler if the sun were at the center and we went around it. The way we imagined the workings of the universe literally shifted, and it became simple again in both perception and mathematics.
The present moment in science has the whiff of Ptolemaic epicycles about it. Perhaps the universe is actually incredibly complex and incomprehensible. Or just maybe it is our models that have become complex and incomprehensible. Perhaps the new theories will yield ways of seeing things that are not as simpleminded as the clockwork universe of the 19th century or as elusive as the unimaginable world of the 20th century. In our new understandings of the relationship of the very large to the very small we may literally revisualize the universe around us.
Scientific advances often rely on the engineering of new tools to open new arenas for exploration. The telescope and the microscope changed our view of the very large and the very small. Atom smashers, now called particle accelerators, made it possible to explore the interior of the atom. X-ray crystallography made possible the discovery of the double helix of the DNA molecule.
Today new scientific tools are leading to profound insights into the realities of nature. The Hubble telescope and other space-based as well as terrestrial astronomical instruments have taken us almost all the way to the edge of the universe and back to its origins in the big bang. Over the coming decade we anticipate building several huge new telescopes both on the ground and in space. The scanning tunneling microscope has given us the ability to see and manipulate individual atoms. New imaging techniques have allowed us to observe the dynamics of very fast chemical reactions.
The most important new tool, however, has been the computer. It has given us the ability to create remarkably faithful simulations of phenomena like turbulent flow and complex chemical reactions. It has allowed us to study very large numbers of examples, millions instead of dozens. It has given us the control systems to manipulate very complex and/or microscopic processes. Indeed, hard scientific problems continue to help advance the frontiers of computing power. Using today’s supercomputers already allows us to simulate many thousands of potential chemical reactions and focus on the most productive few to test on the lab bench. IBM is currently building the world’s most powerful computer, to be known as Blue Gene, to simulate the complex three-dimensional geometry and dynamics of protein molecules.
The Internet, of course, has been primarily a tool for accelerating scientific communication. Research used to involve a long cycle of peer review, publishing, challenging, retesting, publishing again, and so on, a process that took months or even years. Today an experiment is conducted in the morning, and the results can be on the Internet by lunchtime. By midafternoon they might be validated elsewhere in the world and confirmed by the end of the day.
Ideas can be key tools, too. An important new idea has emerged in recent years—the mathematics of chaos. A whole new way of seeing change in evolving systems has been rigorously described by this new mathematical discipline. Two new principles of change emerge from chaos theory. First, small changes at the beginning of a process of evolution can have very large effects much further downstream. Second, the outcome of a process depends on the path it took to get there. Small, almost random changes accumulate over time to make the developmental path of every system in nature unique, if only slightly. On every tree the leaves are very similar but not identical. In the universe there are many spiral galaxies, but none duplicates our own Milky Way galaxy.
Their uniqueness is a function of all the unique events along the way for each system that add up to a slightly different shape and size for the leaf and the galaxy.
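The first of these two principles, sensitive dependence on initial conditions, can be sketched with the logistic map, a standard textbook example from chaos theory. The map and the parameter values below are illustrative choices, not drawn from the workshop material itself:

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n).
# At r = 4 the map is chaotic: two trajectories that start almost
# identically diverge until they are effectively uncorrelated.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full path."""
    path = [x0]
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
        path.append(x)
    return path

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)  # a change in the tenth decimal place

gaps = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {gaps[0]:.1e}")        # 1.0e-10
print(f"largest gap seen: {max(gaps):.3f}")  # order 0.1-1: the paths have separated
```

A perturbation far below any plausible measurement error grows, roughly doubling per step, until the two trajectories bear no resemblance to each other, which is exactly why small early changes can dominate a system’s developmental path.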
A Broad, Deep Revolution
Part of what makes one imagine that a revolution is coming is that fundamental changes are developing across many disciplines. Here the focus is on physics, biology, and chemistry, but many others could be covered. Furthermore, these fields are feeding on each other.
The powerful new tools of astronomy and astrophysics, which include not only the Hubble and huge ground-based telescopes like the Keck but also instruments that observe in other parts of the spectrum, such as x-ray, gamma-ray, and radio, are leading to often surprising discoveries. Among other things we have found that the universe is expanding at an accelerating rate rather than a constant or decelerating rate. This challenges our model of the composition of the universe. Some unknown force is causing the acceleration. The new discoveries are forcing us to reconsider our models of the very large-scale universe.
At the other end of the scale, new findings in particle research and theories like superstrings are leading to revisions of our ever more subtle model of the subatomic realm. Indeed, superstring theory has the potential to be the idea that breaks through the dilemma of the conflict between relativity and quantum theory. It may be the unifying theory of everything. (For a good book on the subject, see Brian Greene’s The Elegant Universe, Vintage Publishing, 2000.)
In the new physics there are also hints of what could become technologically possible. Certainly superconductivity at room temperature seems plausible. New ways of transforming energy also seem plausible. The chemistry that lies ahead will have two distinct new properties. First, we will increasingly be able to predict the dynamics of chemical reactions and the properties of the resulting new materials using computational methods, which means we will be able to design new materials never before seen. In the current research on carbon nanotubes, we are already seeing this process at work. Second, we will have much finer-grained control of molecular dynamics and structures. We are going to be able to maneuver molecules for many purposes, from testing to molecular construction. So the new materials we invent on the computer we will actually be able to make. The revolution lies in the movement of chemistry from the realm of vast numbers of molecules colliding at random to a chemistry that treats individual molecules as meaningful entities. Imagine building with Legos rather than making soup.
In biology the revolution lies in being able to understand the myriad biological processes and structures that shape our world and lives. Among the key outcomes, of course, will be radical new medical therapies, from being able to grow custom replacement tissue to curing most diseases and even reversing aging.
The intersection of the new physics, chemistry, and biology is in nanoscience and technology. The idea is to make functional devices at the atomic and molecular scales. A variety of biological molecules and systems provide useful models. In the new understanding of chemistry we are learning how to align and manipulate molecules in new ways. In the new physics we are gaining insight into the nature of the atoms themselves. The results are amazing new devices, for example, sensors that can detect a single molecule in a gas.
When Eric Drexler first articulated his vision of nanotechnology in 1986, most physicists derided it as implausible: because of the irrepressible movements of very small things, you could never control them or make them stable enough to build nanoscale machines, even though nature does. Fortunately, the skeptics were wrong. Nanoscience has moved from the wild fringes of science to the mainstream faster than any idea in history.
The Singularity Scenario
The outcome of these revolutionary dynamics is uncertain. Perhaps the theorists who argue that there is no new revolution are right: either all that remains is working out the details, or we will never figure it out, but in any case the story is mostly over. That is one scenario, and in it we are unlikely to be surprised by much in the coming decades. The world of 2020 will mostly resemble today’s, with perhaps a somewhat greater technological presence, but no radical departures from current science and technology.
Or maybe these developments have the potential to be revolutionary but remain quite distinct from each other. They develop over a much longer period, and perhaps a century from now they will have accumulated into something significant but difficult to glimpse today. In any event the world of 2020 is likely to be fairly familiar, with only a few surprises along the way. Given what we now know, the surprises are most likely to
come in biology. So we could imagine some fairly radical developments such as cures for many diseases and even the beginnings of age reversal and life extension.
But what if these developments are truly revolutionary, and they interact and feed on each other, along with many other fields, to drive an explosive period of change? In part, of course, the drive comes from the high rewards now available for new technology. But ideas, talent, power, and wealth could combine to fuel a revolution in science and technology. It could create a conceptual “singularity,” as Vernor Vinge described this moment: it becomes progressively more difficult to imagine the emerging world as the pace and magnitude of change continue to accelerate. Like the Spanish peasant, we may want to hug the ground as the emerging universe whirls around us.
We can imagine the consequences of some of the new world view. We would inhabit a highly interconnected universe in which human action matters but one in which control is extremely difficult. We would have remarkable new capabilities to manufacture new devices and new materials in entirely ecologically benign ways. We might even have new clean energy sources. It would be as different from today as a world of airplanes and automobiles was from the horse and buggy and steamships.
THE BIOTECHNOLOGY REVOLUTION IN A SOCIETAL CONTEXT
At 9:00 p.m. the alarm from Katherine’s internal digital-clock appendage startles her, making her nearly drop the culture she’s been working on for the past two hours. After a morning of answering questions from the press and an afternoon of virtual meetings with other scientists in France, home of the world leaders in neuromachines, the only time she can find to make use of her doctorate in biomechanics is late in the evening. Katherine reluctantly pulls herself from her work and carefully stores the culture so she can finish her experiment in the morning. She removes her lab coat and exits the lab, headed toward her corner office. She passes a line of offices where others are still engrossed in their work. One woman is on the phone setting up meetings with the congressional Bio-Machines Oversight Committee. Another tries to finish an article for the Times on the company’s current research into external uteruses and its potential effects on unborn fetuses and on gender inequalities in the workplace. A gentleman is finishing paperwork for the company’s pro-bono division on the long-term effects of a $40 million donation he oversaw to distribute the AIDS vaccine in Kenya and Zimbabwe.
Katherine slowly walks through the front door. She’s tired but decides not to spring for another dose of caffeine. Her vitals monitor tells her that her heart rate is slowing and her blood sugar and potassium levels are dropping—precisely the right conditions for deep REM sleep. She waits impatiently for two minutes as the valet retrieves her car. Before she left her desk, Katherine told her computer via the brain-com interface to have her car waiting for her at the North Lobby. The delay caused by the congressionally mandated requirement that only humans drive cars frustrates her—machines are more reliable and efficient, she thinks to herself. Finally, the valet driving her car arrives. She tips him and then pushes the “home” button on her GPS navigator. The car slowly exits the parking lot. “I’m still pushing buttons,” she thinks to herself. If she had the latest model car, with the brain-com interface introduced by her company, she wouldn’t have to press a button. Ten minutes later she’s home.
Katherine unlocks her apartment door via the fingerprint-encoded lock, drops her things, then rushes to the computer to download her thoughts about exactly where she was in the experiment so she can start back up the next day. As her brain is scanned, she also tells her microwave to heat some milk for her to drink to help her off to sleep. Before bed, she prints a copy of the day’s newspaper. Though paper is a thing of the past, reading from printed sheets reminds her of her childhood and helps her get to sleep.
“Wednesday, March 15, 2018,” she says aloud. “What a day in history!” Five years ago to the day marked the beginning of the second attack on Iraq. The Baath Party had reemerged to seize the government despite the efforts of the United States and the United Nations to establish democracy following the first war in 2003. Following evidence that new programs to field nuclear, chemical, and biological weapons had emerged, the UN, led by the United States, sent in bombers, followed by ground troops, to oust the Baath Party again. Unwilling to yield so quickly this time, party leaders ordered the launch of biological weapons on the U.S. troops and also sent drones over Israel to spray a deadly, genetically modified variant of Ebola that Iraqi scientists had been able to develop through advances using stem cells. Ebola-C, as it was called in this country, a powerful virus that could spread through any exchange of bodily fluids, would attach itself to vulnerable cells in the body and modify their genetic information so that the daughter cells would burst, spreading the virus further through the body until a major organ was destroyed. Death was certain within one to six weeks. Five hundred thousand troops, 79 percent of the troops sent to the Middle East, were killed by the initial release. One million people in Israel were killed. The
virus spread through many parts of the Middle East, leading to the deaths of 3.5 million more people within three months.
Galvanized by the threat of a worldwide pandemic, public health organizations around the world focused on finding a cure for the Ebola-C virus. Fortunately for the United States, the disease did not travel to the States. Nevertheless, the public was stunned by the swiftness and scope of the attack and was intent on redoubling efforts to advance the country’s knowledge of the life sciences to defend against similar terrorist attacks. To facilitate R&D, a major policy change was made to encourage stem cell research. Shortly thereafter, federal research funds, on a par with funds invested during the cold war, were provided specifically for research in the life sciences. Los Alamos, Sandia, and Lawrence Livermore national laboratories and the National Aeronautics and Space Administration developed biological sciences programs, recruiting top researchers, M.D.-Ph.D.s, biologists, chemists, physicists, and engineers, to the task of building up America’s life sciences know-how. All of the top universities expanded their degree programs in bioengineering: biomechanical, bioelectrical, and biochemical engineering. The race for a cure for Ebola-C, and for further advances in the life sciences, had begun. In four short months a cure and a vaccine for Ebola-C were developed in France, where researchers had long been doing stem cell research and where many top U.S. scientists had emigrated in the early 2000s so they could continue their work. The United States continued to press fundamental life sciences research and by 2017 was again the world leader in the life sciences arena. Cures and vaccines for AIDS, cancer, anthrax, and most of the other deadly diseases had been developed.
Flush with success but somewhat removed in time from the last warfare/terrorist event, the United States established many governmental oversight committees to monitor and direct research dollars, fueled by religious conservatives still concerned about potential “creation of life” abuses and by anti-global-business activists protesting the profits of U.S. biotechnology companies. Universities began to develop degree programs in ethics, life sciences, and society; courses in biological advances were added to liberal arts programs; and courses in ethics were made requirements in all science and engineering degree programs. By 2017 there was a plethora of bio start-ups and bio products on the market, selling everything from internal vitals monitors, weight-loss drugs that reprogram the body to stop creating fat cells, memory-enhancing drugs, and biomaterial prostheses to replacement internal organs. The year
2018 brought the release of brain-machine-interacting products, following the huge leaps that had come in understanding the brain and advances in nanotechnology and microelectromechanical systems. Brain-machine communication devices for computers became commonplace, leading to decreased sales of keyboards. In the homes of those with the highest incomes, it was not uncommon to find the entire home equipped with wireless brain-com controllers for TVs and kitchen appliances.
Katherine sipped her milk and skimmed through the day’s national headlines. “Chess Olympiad’s Medal Rescinded After Use of Memory Appendage,” she read. “Religious Conservatives Protest Baby-Engineering Research.” Not tonight, she thought to herself. On the education pages she saw the title “Engineering Schools Recruiting Male Applicants into Life Sciences Careers.” She smiled to herself, remembering the late 1990s, when men largely dominated engineering classes at most universities. When Katherine entered MIT in 2002, near the beginning of the life sciences boom, she was one of the few women double majoring in physics and biology. She and many of her female classmates sought careers that would have an impact on people’s lives and thus chose to enter the bio arena for graduate school. By the time Katherine was in graduate school, the number of women in engineering degree programs nationwide equaled that of men, with many women concentrated in the engineering aspects of the life sciences. Many of her colleagues took technical jobs; others became part of the political landscape by joining the staffs of congressional oversight committees; still others engaged in public relations and marketing for the top life sciences companies. Katherine’s closest friend in graduate school, Ayodele Amana from Lagos, Nigeria, decided to spend her life getting the advances in life sciences all the way to the poorest migrants in Africa, who badly needed AIDS vaccinations and even the simplest of the vitamins and water purification techniques that were commonplace in America generations ago. With the network of people Ayodele had met in graduate school, she was able to start a nonprofit largely funded by the larger bio corporations in the States. The disparity between rich and poor was still growing, but through the work of people like Ayodele the rate was slowing.
The time 9:45 p.m. flashed in Katherine’s mind. If she was going to get up early the next morning to get some lab time in before handling the press conference, she would have to get to sleep soon. So she set her alarm for 6:00 a.m., got in bed, told her steamer what outfit she wanted to wear the next day, turned off all the lights in her house, and then lay back and closed her eyes. “After all the life sciences advances of the past 15 years,” Katherine thought to herself, “why is it that we haven’t figured out an alternative to sleeping?”
THE NATURAL WORLD INTERRUPTS THE TECHNOLOGY CYCLE
In the beginning, at least, Johannesburg proved little different from Rio: high hopes but a disappointing level of global action. In the absence of binding agreements or catalyzing events, the world of 2010 was a natural projection of what existed at the start of the 21st century. There were, to be sure, another 400 million people, and significant increases in cheap computing power had occurred. In “have” countries, that played out as increased visualization and computational abilities that incrementally pushed medicine, engineering, science, and business forward. Meanwhile, ethical storms over biological manipulation and biomechanical hybridization held the public’s real attention.
In the other three-fourths of the world, survival remained the dominant imperative. By 2010, developing nations experienced premature deaths of nearly 3 million citizens per year due to inadequate water, food distribution, and power infrastructures. As in the previous decades, unless those deaths fell close together geographically and temporally so that the media could effectively cover a “disaster,” little happened. The 400,000 who starved in the Somali drought in 2005 did elicit food aid for East Africa. But sadly, when three times that number
died from contaminated water the next year in all of Africa, little happened.
By 2020 advances in many fields held hope for the now 7.1 billion occupants of earth. Genetically altered food had, for better or worse, been accepted in Europe and Africa and starvation was less frequent. Major advances in conservation had taken place in some parts of the world, while others burned their energy with abandon. Petroleum was still the key to transportation and major power plants continued to dominate electricity distribution, though the first real economic inroads of alternative energy sources took hold late in the decade. The predicted ecological collapse had not happened yet. Doomsayers saw dirty water, air, and soil just around the corner, while others declared that earth’s systems were far too robust to falter simply because virtually the whole planet was now farmed, lived on, or otherwise used for human purposes.
The world also finally stepped up to the titanic problem of providing clean water for every person every day. The cost was not low, in both disruption and dollars. Water was used in greater harmony with the climate; rice was no longer grown in deserts. And through a concerted effort, low-cost solutions for providing clean water in the poorest regions were paid for and implemented by the G8, with the help and cooperation of all nations. In preserving human life, this effort surpassed all of the more glamorous advances in biotechnology. While breathtaking (and much breath was spent debating their morality), the benefits of cloning and nano-biotechnology were available only in the wealthy portions of the world.
Natural disasters continued at an even pace. AIDS predictably killed 150 million people by 2020, but the epidemic slowed dramatically as the pool of infected people shrank. Earthquake abatement strategies continued to be enforced in Japan and California: new buildings were built to code, and older buildings were retrofitted. Hurricane evacuation routes along the Gulf and Florida coasts, along with improved hurricane forecasting, limited human death but not property destruction.
Politically these were turbulent times. Local wars and uprisings seemed endemic. While the Afghan conflict started the century, other hot spots in Asia, Africa, and South America seemed to flare up two or three at a time. As one area found peace or at least a tense calm, it seemed two more wars sprang up. Nothing ever escalated to engulf an entire region—though the 2012 Korean affair easily might have, given the posturing of China and the United States. And the predicted global
chaos from a major cyberterrorism attack on the economic system never occurred—though not for lack of trying. Fortunately, the security people stayed just far enough ahead of the terrorists, and the hits that did occur were best described as disruptive but not devastating.
It was a dark and stormy night in Miami. Hurricane Stephan, the fifth category 5 storm to lash Florida in 2011 and the twelfth hurricane of the year, was holding sway on October 10. Yet the damage would not be what it might have been. The reasons were complex, but two probably tell the story best: the increased power of computers, and the 2004 United Nations conference on disaster prevention, generally known as the “disaster conference,” whose recommendations Floridians (and residents of Tokyo, Singapore, and most disaster-plagued areas) are happy to have heeded. The conference was the result of insurance company pressure combined with increased abilities to predict severe events and fairly solid predictions of more frequent natural disasters.
Essentially, the world representatives decided the chances of an increase in the number, severity, and points of landfall of hurricanes were high enough to warrant preventive action. The response was primarily one of planning and education until significant improvements in both climate models and cutting-edge computational power combined to give a (relatively) convincing prediction of an oncoming era of many very large hurricanes. At that point, serious money was spent to prepare. By the demonic Atlantic hurricane season of 2011, major hardening of both homes and public infrastructure was a reality. While there is little that can be done about tornadoes embedded in 125-mph baseline winds, the rise of distributed power in hurricane areas did much to limit the scope of power loss. Major public buildings (hospitals, etc.) were upgraded to withstand those winds. Localities equipped community centers with steady supplies of food and water in anticipation of regular, if temporary, evacuations. More people bought generators to share or to use at home. Residents moved out of the lowest-lying areas when it became clear they would be wiped out by storm surges nearly every year, and when faced with financial incentives both negative and positive: the unavailability of low-cost loans to rebuild and tax incentives to move. The net result was a big increase in retirement communities in Arizona
and a corresponding decline in Florida. Tourism faded some in Florida, especially from June through December. The state offered tax incentives for low-cost industries to relocate inland.
The “disaster conference” did, however, miss tsunamis. Looking back on the 2004 world conference, the one major “mistake” was treating protection against tidal waves as a lower priority; at the time the chances seemed too low and the cost too high. And the asteroid striking the Pacific Ocean in 2017 was certainly a surprise. It didn’t fit the “insurance” model: 1.3 million died in the Puget Sound region alone, and the economic cost in the United States ran to multiple trillions of dollars (Tokyo did not suffer a direct tidal hit, a blessing that saved 10 million lives).
The almost total destruction of the heart of two major corporations, combined with public panic, caused the biggest stock market plunge in U.S. history, and, until industry worked out the effects, the whole country suffered. The loss of the ports of Vancouver and Seattle crippled southwestern Canada and the northwestern United States. The grain crop of eastern Washington was partly lost because it could not be shipped, and logging virtually ceased for the same reason. The loss of inland jobs was a surprise addition to the “wave” of depression that followed the wave of water. Virtually everyone within 200 miles lost a friend or family member. Psychiatric services were swamped. Suicide levels tripled in Washington.
The collapse of Microsoft and Boeing, combined with the almost total collapse of the insurance industry, plunged financial markets worldwide into a tailspin. Almost overnight, investors opted out of the market—until they could figure out where to put their money, they sat on it. The northwest power grid went down, taking parts of California and Oregon with it. The country did respond: rolling brownouts across the West shared the burden but hurt manufacturing everywhere. Fearing the future, people stopped buying all but the essentials. They didn’t leave their homes as often.
The public health system in the Seattle area was a disaster. Beyond the immediate deaths, hundreds of thousands died due to their inability to get treatment or even prescription medicines. Epidemics of cholera and other waterborne diseases were rampant. There was little fresh water, other than what was provided in disaster relief.
Fires from the failure of fuel pipelines, large and small, destroyed many buildings that had actually survived the wave. To help in the disaster response and aid recovery, much of the U.S. military was deployed to the area. Soldiers served as construction workers, road builders, electric power restorers, and food distributors. This use of military personnel immediately reduced the ability of the United States to send forces into the rest of the world and thus reduced the country’s ability to counter a foreign threat.
In the ensuing years a great deal of effort went into determining areas of historic vulnerability, developing methods to passively disrupt tsunamis, and relocating the most essential institutions. Tidal waves now carry the same urgency as their fraternal twins, earthquakes. Seattle is now the model city for this planning. Most property below what is now known as the “high water mark” is devoted to parks and “expendable” uses. All essential services are routed underground. The innovations are innumerable. In the end it was cascading effects that determined the scale of devastation and convinced the world that the cost of preparing for tidal waves was not too high.
Other world events paled in comparison to Seattle’s tsunami. In general, the trends of the decade continued. There were no major volcanic eruptions in populated areas. Until Seattle, proposed efforts to protect the few U.S. cities at risk of natural disasters were a constant target for budget trimming.
GLOBAL CONFLICT OR GLOBALIZATION?
The Washington bomb went off in 2013. Members of the administration and other world leaders died instantly, reawakening the U.S. populace to the realization that the lull in terrorism in the mainland United States since September 2001 was illusory. The driving factors were little changed over the intervening 12+ years. In the West, economic and societal disparities grew extreme, while in the East pressures from population explosions and religious fundamentalism continued unabated.
The attack on the World Trade Center towers in 2001 opened eyes to the increasing technological disparities among nations. An immediate military response removed that particular group of terrorists, but others saw the attack as a major success and made plans of their own. The ensuing world action taken against Iraq created illusions of peace and prosperity for the next few years, but while economic development returned to a breakneck pace, the influences that precipitated the first attack only worsened.
New worries began in 2004 when Pakistan’s government was taken over by a group of religious fundamentalists whose agenda included shaking off the effects of American cultural imperialism. Political and religious alliances sprouted in the Middle East, making it clear that any military action would result in a conflagration of the whole region, thus preventing outside involvement. Tensions grew in the Kashmir region, and a nuclear skirmish was narrowly averted thanks to a surprise intervention by China, where the Communist Party remains in power to this day. Small-scale terrorism began around the world, and even though many of the attacks were attributed to the new extremist regimes, no efforts were made to displace them.
After the horrific terrorist attacks at the Olympics in 2008, foreign vacation travel all but ceased, and governments started tightening their borders to prevent future attacks. These protectionist tactics crippled the world economy, reducing trade and ending economic cooperation. The United States saw equal gains in defense spending and its homeless populations; homeland defense and military spending reached half of the receding gross domestic product in 2010. To make matters worse, new restrictions were placed on visas for foreign graduate students and on H-1B visas, bringing university science and engineering research programs to a low ebb and strangling small businesses that could not afford to compete for scarce technical personnel.
To some, terrorism and the military actions to combat it were linked with the availability of technology; thus, in 2020 they found themselves fearing technology and those who wielded it as much as they once loved it. For them, economic, political, and religious beliefs have eclipsed a belief in the value of technology, taking away both the demand and the desire to engineer new devices.
Despite efforts worldwide to close borders, terrorism rose in the major economic powers—even Japan and Switzerland saw increased activity, driven as much by a desire to attack sources of technology as by a desire to attack the nation itself. Feelings of nationalism reduced international cooperation. Stuck in their own downward economic spirals, many countries eliminated their aid to Africa and South America. Heightened security led groups to curtail all efforts to distribute better technologies to third world nations, leading to rampant problems with sanitation and water quality. These events sparked more desperate measures to express anger over the divide between affluent and impoverished societies.
In the face of terrorist threats, the developed world moved to isolate the developing world to the degree that it could. Among traditional trading partners, it remained business as usual. While major international companies had long moved manufacturing and technical service jobs overseas in search of low-cost suppliers, the pressure to outsource “creative” jobs, such as engineering design, mounted. This placed severe downward pressure on the availability of engineering jobs in the U.S. economy. Further factors helping to displace jobs for new engineers were the productivity enhancements made possible by computer-aided design and software engineering. A new bimodal class system of engineers emerged: an elite few charged with controlling software improvements and a lesser class of technicians who executed the standard programs and implemented the results. Engineers found few jobs outside the defense industry, and only the major engineering universities survived the downturn in enrollments.
Dangerous concentrations of disaffected young people existed in many developing countries, driven by modest improvements in health care and high birth rates. Those countries with more enlightened leadership recognized that education was a solution and that technology was the engine of growth, and thus moved to create or enhance their systems of technical education. The cadres of scientists and engineers thus created were available for technical jobs outsourced from the United States while their indigenous industries were being built.