Making Our School Buses Safer
Charley V. Wootan
School is opening this week and millions of children will board buses without safety belts. That may come as a shock to many parents who have diligently required seat belt use in their own cars. Now they must allow their children to ride unrestrained in a much larger vehicle.
It doesn't seem to make sense. Here our schools are determined to protect our kids from everything from AIDS to asbestos, yet most school buses in the United States lack safety belts. Wouldn't it be a better use of our resources to buy and install them?
Surprisingly, it might not.
I recently chaired a committee of the National Research Council that studied this question in detail, and we discovered—contrary to what one might expect—that there are more effective ways to protect young bus riders than with seat belts. In fact, we concluded that the overall potential benefit of requiring seat belts in school buses is insufficient to justify any new federal standard, although states and local school systems might want to install them on their own initiative.
In an average year, 10 children die while riding in large school buses—which, unlike smaller buses, generally do not have seat belts. By contrast, nearly 40 children are killed annually while trying to board or leave a bus. About two-thirds of those 40 are struck by a school bus, usually their own.
Such a relatively low death rate is impressive when one considers that the nation's 390,000 school buses travel nearly 4 billion miles annually. Riding in a school bus is four times safer than riding in a passenger car.
Still, 40 or 50 deaths of children cause acute suffering for the families involved, and we ought to reduce this toll if we can. Our committee, which included experts in highway safety, pediatrics, bus manufacture, occupant-restraint systems and other fields, calculated that equipping all school buses with seat belts would save one life and avoid several dozen serious injuries each year. The cost would be more than $40 million per year.
Alternatively, it would cost only $6 million annually to raise the height of seat backs from the standard 20 inches to 24 inches. This would save two to three lives and prevent up to 95 injuries a year.
In terms of both cost and lives saved, in other words, raising seat backs is probably more effective than installing seat belts. It also avoids the problem of asking drivers to keep their eyes on the road while simultaneously ensuring that all the children actually use the belts. Any parent who has tried to do the same with lively children in the back seat knows this is easier said than done. It is reasonable, however, to expect bus drivers to prohibit their passengers from standing in the aisles while the bus is in motion. More states should require this, which means ensuring enough seats for everyone.
More generally, states and school systems ought to focus on where most fatalities actually occur, which is along roadsides and in school loading zones. Too many children are killed when they dart in front of buses or play near them. Many of these deaths could be prevented with better safety training for both children and drivers and by installing stop-signal arms and improved cross-view mirrors on buses. Other possible techniques include loudspeakers to warn children, crossing-control arms to guide them as they cross the street, and electronic or mechanical sensors to help bus drivers detect children in blind spots outside the buses. All these warrant further study.
One of the best ways to improve safety is by removing from operation as quickly as possible buses manufactured before April 1, 1977. These older buses are not as strong structurally and have less padding on the seats, lower seat backs and less "compartmentalization" to contain children in the event of an accident. They also are more prone to post-crash fires, as occurred in the terrible 1988 crash of an older bus in Carrollton, Ky., which killed 27 occupants.
Raising the height of seat backs, prohibiting standees, improving bus driver and student education, testing new safety devices and getting old buses off the road all would reduce deaths and injuries significantly without burdening already strained school budgets. Seat belts are valuable, too. Yet, to save as many children as possible, we ought to concentrate our efforts where they will do the most good.
September 3, 1989
Charley V. Wootan is director of the Texas Transportation Institute at Texas A&M University.
* * *
A Computer Future Without a Heart
Samuel H. Fuller and Damian M. Saccocio
Like the Tin Man in "The Wizard of Oz," the United States faces a future without a heart. For all the euphoria we Americans may feel about recent political changes around the world, we face a less-publicized economic threat to the heart of one of our most vital industries and, thus, to our collective prosperity.
The industry is computers, and our country's position in it is far more precarious than most Americans realize. After all, the computers and software that most people use in their offices or homes are made by U.S.-based companies. It certainly appears that, unlike the situation with video cassette recorders or automobiles, computers are one industry where our country remains on top.
But appearances are deceptive. The U.S. computer industry is actually a collection of several connected industries—from component manufacturers to software designers. Together with telecommunications, they account for a tenth of our country's gross national product.
All of these diverse enterprises depend on the microchip, the incredibly small and powerful device that is the fundamental building block for all computers. Microchips make possible the word processors, computerized banking, video games and many other innovations that have revolutionized our lives.
A National Research Council report released this past week suggests that the microchip heart of our computer future is deteriorating, with potentially grave consequences not only for the industry but for U.S. society generally. Production of microchips and semiconductor equipment was pioneered in the United States. Yet, during the past five years, one U.S. company after another has withdrawn from the marketplace in the face of intense foreign competition, particularly from Japan.
As happened with many consumer electronic goods,
Americans excelled at developing the first microchips, but Asian firms came on strong in producing them in large quantities efficiently and with high quality. Asian manufacturers now produce over 90 percent of the world's dynamic random access memory (DRAM) chips. Unlike most U.S. firms, they endured a glut of these essential chips in the mid-1980s and are now reaping the profits from a $10 billion market.
If current trends continue, U.S. computer hardware manufacturers may find themselves limited to specialty markets, which is roughly comparable to our automobile industry agreeing 50 years ago to produce only jeeps and convertibles. This new situation threatens a loss of jobs and profits not only for U.S. chip producers themselves, but also for companies in related industries. Asian firms already have claimed about 45 percent of the U.S. market for personal computers and, together with European firms, are making inroads in the software arena.
Of course, cheap personal computers and other products are a boon to American consumers, but not at the cost of our semiconductor industry. Our national future depends on computers and information systems, and we simply cannot write off our core industry in this field. It is essential that U.S. chip manufacturers fight back by becoming more global and strategic, pursuing new technology and opportunities worldwide. They must not only invest in long-term research, but also do a better job of making the less-glamorous incremental improvements that often spell the difference between success and failure.
They cannot compete alone; they need the assistance of other U.S. companies and researchers, as well as of government. Our traditional cult of the solo entrepreneur has limited applicability to this task, given how closely foreign megafirms work with their governments and bankers. In fact, cooperation among our own companies, universities and government has contributed to many of our country's successes in technologies ranging from computer networks to parallel computing. We need more such collaboration.
Our computer industry is now a justifiable source of national pride, but a failure to recognize its precarious position
risks taking us down a path already traveled by U.S. manufacturers of automobiles, cameras, steel and other products. The challenge posed by foreign companies and governments is formidable, and we must become more strategic and decisive in meeting it. We cannot allow our heart to just wither away.
January 28, 1990
Samuel H. Fuller, vice president of research at Digital Equipment Corp., chaired a colloquium on the future of the U.S. computer industry for the Computer Science and Technology Board of the National Research Council. Damian M. Saccocio is a staff officer with the board.
* * *
Toward Motoring Smart
Robert D. Ervin and Kan Chen
As millions of commuters well know, traffic in most metropolitan areas is straining the highway system to its capacity. If present trends continue unchecked it will get even worse. Fifteen-minute delays suffered by the typical commuter in 1988 may stretch to an hour by early in the next century. Congestion will prevail throughout the day. Accidents will become even more disruptive.
The traditional solution to traffic congestion has been to build more roads. But we are reaching the point where we can no longer build our way out of the congestion crisis. Land and money are too scarce and many communities oppose new highways. What is needed now is truly radical change, including a more imaginative application of technology to vehicles and the highway system.
For example: A common bottleneck on many highways is toll-collection booths. These could be eliminated with a system that assigns each vehicle a code number and then scans vehicles as they pass by, like items at a supermarket checkout. The system would send each vehicle owner a monthly bill.
Roadside transmitters could send messages to voice synthesizers inside vehicles, informing drivers about road conditions. Vehicles could be equipped with compact discs that provide electronic maps of any neighborhood in the country. They might carry radar or laser systems to warn drivers of impending collisions with other vehicles. Technology built into the highway might regulate the position of all vehicles, controlling merging and exiting.
Such concepts of "smart transportation" have received a fair amount of publicity lately, but too many of the stories fail to look beyond the "gee whiz" nature of the technology to how we actually put it in place.
We must learn from Europe and Japan, which have been exploring these options aggressively through joint public- and private-sector efforts. The most comprehensive European effort involves 20 automotive manufacturers and 70 research institutes from six countries. Non-European parties are excluded. Japan, likewise, has been going it alone in developing intelligent transportation concepts with an eye toward eventual commercial markets.
Research on "intelligent highways" in the United States has been much more modest and dispersed, although lately there have been some hopeful stirrings. Cooperative research programs have been established at the University of California, Berkeley; the University of Michigan; and Texas A&M University, College Station.
The three domestic U.S. automakers and various electronics firms are starting to participate in long-range research on intelligent highways. The U.S. Department of Transportation has provided some support.
Still, a much greater research effort is needed to address the many questions that remain unanswered. How, for example, will transportation planners handle the transition to this new system, when some vehicles are equipped with new features while most are not? From a legal standpoint, who
will shoulder the blame if an accident is caused by a glitch in the automated roadway: the driver, the highway department or the car maker?
Until now, technological and social concerns like these have kept both the private and the public sectors in our country in a state of inaction. U.S. companies have hesitated to invest in technology whose horizon for new products is years away. Government transportation agencies have been unwilling to undertake research without assurances that it will solve real highway problems.
To escape this chicken-and-egg situation we need more than tinkering with conventional approaches; we need vision. The Big Three car companies, in particular, must provide leadership and resources to make the concept credible. Highway agencies, too, must take a longer-term perspective. The only way to develop this new paradigm of a vehicle-highway system is for vehicle designers and road builders to work together. Doing so will also require a commitment from political leaders and a strong federal role because the states are unable to conduct the research and field trials of these systems by themselves.
We have no alternative as a nation but to proceed vigorously with a joint government-industry effort. The magnitude of the U.S. traffic problem requires that we innovate in dealing with our vehicle-highway operations, providing American industry with a setting in which to work with government in developing the high-tech option. Otherwise, the collaborations already under way overseas will enable Europe and Japan to claim this new marketplace uncontested. It is time we, too, steered into the future.
February 21, 1989
Robert D. Ervin is a research scientist at the Transportation Research Institute at the University of Michigan. Kan Chen is a professor of electrical engineering and computer science at the University of Michigan. This article is adapted from a longer version in Issues in Science and Technology.
* * *
Easing the Crunch at Our Airports
Joseph M. Sussman
The volume of air traffic is rapidly outpacing the capacity of our nation's airport system. The current level of 1.3 million passengers daily is expected to double by early in the next century.
That could mean trouble for anyone who flies.
The boom in air travel, spurred by deregulation and other factors, has strained our major airports. In 1987, 21 of them experienced at least 20,000 hours of airline flight delays. The need for additional and better utilized facilities is acute. Yet only two new airports have opened during the past 20 years, one near Dallas, the other at Fort Myers, Florida.
Just one major airport is now under construction, outside Denver. Possible new airports in Los Angeles, Austin and northwestern New Mexico will not be operational before 2000, if they are built at all. Airports under consideration in Atlanta, Chicago, Minneapolis-St. Paul, San Diego and St. Louis lie even further in the future.
In a report being released today, a committee of the National Research Council said the outlook is bleak unless something is done to accommodate the continuing growth of air travel. Air gridlock would be agonizing for travelers and would harm the economy.
Adding airport capacity through new runways, terminals and parking lots, as well as with entirely new airports, is one alternative. Yet complex considerations, including noise, environmental damage and budget limitations, often prevent the addition of flights or the expansion of existing airports, much less the building of new ones. Given the slow pace of current expansion and the likelihood of further political opposition to new airports, we also must think creatively about other options.
One possibility is to shift some traffic to airports that are now underused, creating new "airline hubs," as has been done by Delta at Salt Lake City and Orlando and by American at Raleigh-Durham and Nashville. This would ease the
burden on overloaded hub airports elsewhere and reduce delay throughout the system.
An even more innovative, although unproven, idea is to develop airports specifically designed as transfer points. More than half of the passengers who arrive at Chicago's O'Hare and Atlanta's Hartsfield, for example, are headed somewhere else. Why not have these passengers change planes at a less crowded location and reserve major metropolitan airports for people who are actually traveling to these cities?
Managing demand, through either centralized administrative management of the air network or such market measures as peak-hour pricing for airport access, can lead to better use of existing capacity—albeit not without controversy.
New technology also can help. Aircraft equipped with quieter engines may be able to operate less obtrusively, enabling airports to increase operations or to stay open longer without bothering nearby residents. Improvements in the air traffic control system will allow runways to be used more effectively. A new generation of widebody jets, meanwhile, may be able to carry between 700 and 1,000 passengers, reducing the number of flights needed to serve densely traveled routes.
New aircraft that can operate on very short runways or that take off and land vertically could replace conventional planes for shorter flights while requiring much less space at airports. This could free longer runways for large jets or allow air service at smaller satellite airports near major population centers.
Another possibility is to develop high-speed surface transportation to replace air service in heavy travel corridors. Passengers who now fly between, say, New York and Boston might be attracted to a ground-based system using high-speed rail or magnetic levitation, or to "smart" highways that allow smart vehicles to travel with computer assistance.
There is no single, simple answer. All these and other ideas require extensive study and testing and a much more vigorous research and development effort. The federal government, working in cooperation with state and local authorities and with the private sector, must move decisively to avoid saturation of airports and airways. Action is needed now to lay the foundation for a safe, convenient and affordable intercity
travel system to carry us into the next century. Our current system simply cannot stretch to twice its size without breaking.
October 14, 1990
Joseph M. Sussman, director of the Center for Transportation Studies at Massachusetts Institute of Technology, chaired a National Research Council committee that studied the capacity of U.S. airports.
* * *
Protecting Our Phones from Terrorism
John C. McDonald
A growing terrorist threat in the United States is as close as your nearest telephone. Although deregulation and the introduction of new technology have improved our national phone system in many ways, they also have left it more vulnerable than ever before to terrorism and other perils.
That might not appear to be as alarming as a terrorist attack on other targets, such as an airliner or a municipal water supply, but the situation threatens far more than Sunday chats with Grandma. Major disruption of telephone lines could prevent air traffic controllers from communicating, cut off police and fire services and wreak havoc on businesses that send large amounts of data over the lines. Our information society could be brought to a screeching halt in the blink of an eye—or the cutting of a cable.
A glimpse of this perilous future was provided a year ago when fire damaged a large telephone switching facility near Chicago. Tens of thousands of people were without emergency 911 service for many days. Some businesses suffered large financial losses and O'Hare Airport shut down temporarily.
The U.S. phone system is increasingly vulnerable not only to terrorism and fires, but also to computer hackers, natural disasters and inadvertent damage from construction digging. Internationally, most of the phone calls that connect us with the rest of the world will soon be carried by a small number of fiber-optic cables whose location on the ocean floor is well-known. These, too, are possible targets for terrorists.
The situation has worsened following the breakup of AT&T, which operated a national center that was responsible for maintaining phone service during emergencies. Today, that responsibility has shifted to a government agency, the National Communications System. The NCS is professional and hard-working, but the task it faces is ever more difficult.
The NCS must deal with several companies instead of just one, and many of these companies—such as AT&T, MCI and Sprint—have different technological standards, equipment and rules. Instead of securing a single network, in other words, NCS has to cope with a "network of networks." In today's competitive environment, furthermore, these companies cannot easily invest in costly security measures for which there is no immediate payback. In the past, AT&T provided security and emergency measures as a normal cost of doing business.
Beyond these institutional problems is the double-edged sword of technology, which has made long-distance calls not only clearer and cheaper, but also more vulnerable. Advances in fiber-optics technology, for example, now make it theoretically possible for the equivalent of all the telephone calls at a given moment in the United States to fit into just one cable. Although lines of this capacity are not yet in service, the trend clearly is towards concentrating more and more calls onto fewer fibers.
Digital switching technology is concentrating the network in a similar fashion, routing far more calls than did previous devices. Over the next few years, the 19,000 central switching offices in the United States will give way to a much smaller number of offices with switches that are far more powerful. The computer software that lies at the heart of this emerging system, meanwhile, also has grown more concentrated.
To assure reliability, the National Security Council should address this growing vulnerability of the U.S. telephone system and consider steps to reduce it. For instance, security might be improved at critical network nodes and more diversity and redundancy might be built into the system. Crisis-management capabilities could be improved, such as by requiring companies to provide priority service to police, hospitals and other selected users during declared emergencies. More should be done to protect telephone software from hostile use.
As matters now stand, a few well-placed grenades could bring down large portions of our domestic long-distance networks, cutting off calls from Hartford to Honolulu. We need to protect the system, now, before that happens.
August 6, 1989
John C. McDonald, executive vice president for technology at Contel Corp., chaired a National Research Council committee that studied the growing vulnerability of the U.S. phone system.
* * *
The New Arsenal of Democracy
Robert B. Kurtz
It's hard to remember that before Iraq's forces rolled into Kuwait, Americans were debating how to spend the "peace dividend" produced by the end of the Cold War. Now those hopes seem naive. All of us have been reminded that the world remains a dangerous place.
Saddam Hussein has illustrated in the boldest colors that the United States cannot beat swords into plowshares without having a backup plan to convert them back into swords. Although we may not need the same weapons or troop levels we had during the height of the Cold War, we must retain the capacity to meet new crises.
This includes rapid deployment forces, fighter planes, ships and other military assets. But something else must be on our list as well, something less obvious but perhaps even more important: the industrial capability to produce weapons and supplies quickly.
That capability is diminishing at a disturbing rate. I chaired a National Research Council committee that examined this recently and we came away worried about how well, and how fast, the United States will be able to respond to conventional wars in the future.
In today's technological world, it will be much harder to gear up for an extended fight than it was during World War II, when "Rosie the Riveter" and millions of other Americans worked around the clock to produce the guns, tanks and other supplies for the nation's "arsenal of democracy." Today, Rosie might need a degree in computer science and the skills to work on an advanced fighter aircraft, not to mention the right equipment, materials and support system.
Many of these resources are increasingly uncertain in the United States. Suppose, for example, that a modern Rosie were asked to design a microchip for an aircraft guidance system. She might find it difficult, if not impossible, to locate a domestic company with the equipment to convert her design into an actual product. U.S. firms that produce such critical products as bearings, machine tools and computer components have declined severely in the face of intense international competition. U.S. companies also are struggling in such fields as optics, sensors, ceramics and superconductors, all of which have potential military applications.
A single component produced by a U.S. firm can be essential to a complex weapon system. If the firm goes bust, the Pentagon must go elsewhere. If every other U.S. company also drops out of the market, as has occurred in many consumer electronic fields, the Pentagon must buy the component overseas. That leaves the weapon system, and in turn our defense, vulnerable in a world in which we must expect the unexpected. Just imagine if we couldn't get our hands on the computer chips needed to run our planes, submarines and tanks.
U.S. defense contractors should be modernizing their plants to avoid this possibility. But, even before recent cutbacks within the industry, defense firms were investing inadequately in manufacturing technology. Many companies lack the sophisticated machinery needed to produce modern weapons. One reason is that the Defense Department provides little incentive to them to operate more efficiently. More attention is given to designing weapons than to ensuring that they can be manufactured.
Many companies also lack excess equipment for emergency defense production. In an era of corporate raiders and intense competition, they can no longer afford to keep equipment idle waiting to be revved up during a crisis. They are poorly prepared to cope with the inevitable bottlenecks that a sudden gear-up would cause.
The problem is not only within the companies, but at the Pentagon. The Defense Department is aware of the need for industrial preparedness, but its process for ensuring it is fragmented and inadequate. President Bush should create a national group to set clearer policies for industrial preparedness and Defense Secretary Cheney should elevate responsibility for the issue to a higher level. Other top planners also must give the problem more attention.
The gap between our military needs and our industrial capability is widening and in danger of becoming unbridgeable. Beyond the horizon of the Persian Gulf, this eroding capacity to fight conventional wars poses a real threat to our security. There will always be another Saddam, Khaddafi or Noriega, and we need to be prepared for them.
August 26, 1990
Robert B. Kurtz is a retired senior vice president of the General Electric Co.
* * *
Building Houses People Can Afford
Ezra Ehrenkrantz
Why is housing so expensive when televisions, computers and many other goods cost relatively less now than they did before?
Between 1970 and 1987, the consumer price index tripled, but the price of clothing and telephone services only doubled. The median sales price of a privately owned one-family house, on the other hand, jumped from $23,400 to $104,500—a breathtaking increase of nearly 350 percent.
Breathtaking and, for people trying to purchase their first home, heart-stopping. Only 20 percent of Americans now earn enough to purchase a new house at market rates without a trade-in, a dramatic drop from 50 percent two decades ago. For millions of people, the dream of owning a home has faded.
The main difference between houses and televisions, of course, is that houses require land, which is in fixed supply with rapidly escalating costs. Housing prices also are affected by interest rates, local business conditions and other factors that are hard to ameliorate.
But one factor that can and should be changed is the outdated way we build houses. Modern building techniques could reduce the cost of a new home from, say, $100,000 to $90,000, or even less. That is not a huge difference, but every dollar counts, particularly when one computes interest costs over the life of a mortgage.
Most builders in our country now produce houses one by one with conventional materials instead of taking advantage of mass production techniques and newer technology. They install bathrooms one fixture at a time rather than using prefabricated units with the lights, toilet, sink and tub already in place. They do the same for kitchens and make inadequate use of breakthroughs in composite materials, microelectronics and robotics.
The lowly two-by-four remains the primary construction material, even though a growing demand for wood products has caused it to become scarcer and more expensive. Few
American home builders have thought seriously about replacing two-by-fours, a sharp contrast with the situation overseas, where many builders are experimenting actively with alternative materials and systems.
The failure of the construction industry to innovate threatens its own future in the same way that technological complacency hurt U.S. automobile and steel manufacturers. In some states, segments of the construction industry are now dominated by foreign companies.
For frustrated home buyers who lack the money even for modest "starter" homes, the situation is already critical. It will probably get worse so long as housing retains the characteristics of a service industry rather than a manufacturing industry. The aging of the baby boom generation and other trends may provide some relief, but low productivity will keep many Americans in rental units instead of their own homes. Those at the bottom of the economic ladder, in particular, will face rising rents and fewer viable options.
Home builders are not inherently averse to new technology. However, there now is little incentive within the industry to invest in technological innovations. Developing new technologies is expensive, requiring not only basic research but also material testing, construction of prototypes, code approval, tooling for production and marketing of the final products. Any one of these activities may take several years.
As things stand, the would-be innovator has no way of knowing what interest rates, the money supply and other conditions essential to success in the housing market will be like when the product is finally ready. As a result, over the past 15 years the building industry has tended to make minor changes to existing products rather than invest in true innovations.
For the sake of millions of would-be home buyers, this needs to change. One of the best ways the industry could become more innovative is through new public and private programs that spur fresh concepts and new products. Test beds should be established to try out appropriate ideas, facilitate testing and speed regulatory approval of innovations. To succeed, experimental programs would need to protect prototype designs and a limited number of housing units from frivolous lawsuits, and to disseminate their results widely.
Instead of wringing our hands endlessly about housing costs, it's time we tackled each of the components of that cost and, with respect to technology, became more creative about supporting research and development. Americans need houses they can afford.
December 31, 1989
Ezra Ehrenkrantz is president of Ehrenkrantz, Eckstut and Whitelaw, an architectural firm in New York. He holds the Chair of Architecture and Building Science at the New Jersey Institute of Technology and is a member of the Building Research Board of the National Research Council.
* * *
Designing for an Aging America
Sara J. Czaja
Although the elderly are the fastest-growing segment of the U.S. population, everyday environments often make life difficult for them. For example:
Labels on medicine bottles often are small and difficult to read.
Water faucets, door knobs, and lids on jars and other containers can be hard to open or close.
Kitchen counters and cabinets sometimes are too high to reach, especially for people with arthritis.
Appliances often are difficult to operate.
Most of these and hundreds of similar problems are nuisances, but some are life-threatening. I recently co-chaired a National Research Council committee that found that many of the problems could be prevented with forethought and better design. The four examples cited above, for instance, might be remedied with larger medicine labels, new faucets, lower kitchen counters and less-complex appliances.
During the 1990s, the number of people in the United States aged 55 and over will increase by 11.5 percent, a gain of more than 6 million persons. The growth rate among those aged 75 and older will be 26.2 percent, or a gain of nearly 4.5 million people.
Many of our homes, workplaces and highways were designed for a younger population. This is problematic; people aged 65 and over account for approximately 43 percent of all home fatalities. Many need help carrying out routine activities. Yet, in general, their special needs have received remarkably little study. Research is lacking on how declines in physical and mental agility that occur with aging affect the ability to carry out routine activities.
For example, while it is known that the strength of a person's grip tends to decline with age, the implications for the design of kitchen appliances have been studied insufficiently. Similarly, we do not understand how declines in vision, memory and reaction time affect one's ability to drive an automobile.
This is not to suggest that the needs of the elderly have been ignored; many companies produce excellent products for older customers. Neither is aging synonymous with frailty or illness. Many older Americans enjoy healthy, productive lives. But increased research on the "human factors" needs of the elderly would be of enormous value. Wherever possible, the goal should be to create designs not specifically for the elderly, but for the entire population. Safer bathtubs or stairs, for example, would benefit everyone.
High priority should be given to living environments. Approximately 13 percent of older people who live at home exhibit at least one major decline in physical mobility. Elderly people also are more likely to live in older homes that need maintenance and repair. Many end up in nursing homes for lack of relatively simple changes in their homes. That is a tragedy for those involved and costly for society.
Similarly, an unknown number of older workers retire every year because their companies are inflexible in meeting their specific needs. Older employees may require extra lighting or a bit more time to perform certain tasks, but they more than compensate in other ways, such as with their greater experience. More research also should be done by transportation planners on everything from redesigning roadways to providing easier baggage handling in airports.
Technology can solve many, although not all, problems. Scalding accidents in bathrooms are common among the elderly, for example, and they could be reduced by redesigning water controls, lowering water pressure or regulating the temperature. New remote monitoring devices can track the vital signs of elderly people or the movements of Alzheimer's patients who may wander.
One inspiring application of technology is occurring in the Miami area, where a group of 38 elderly women are using home computers to send each other messages and view the latest news, weather and movie reviews. The women, the oldest of whom is 95, already have used the system to produce a cookbook. They now are eager to learn new applications, such as banking or shopping from home.
Society needs more innovations like this for its homes, offices, roads and shopping centers. After all, none of us is getting younger. Our "built environment" ought to age gracefully along with us to meet our changing needs.
September 23, 1990
Sara J. Czaja is research director at the Stein Gerontological Institute in Miami and associate professor of industrial engineering at the University of Miami.
* * *
Preparing for the Next Big Natural Disaster
Richard E. Hallgren
The dust has settled from the Bay Area earthquake, but what about the next big quake? What if it strikes not in California, which is relatively well-prepared, but in some other part of the country whose susceptibility to earthquakes is less well-known?
Charleston, for example, is still digging out from Hurricane Hugo, but a major earthquake struck there in 1886, killing 60 persons. Missouri has a high probability of experiencing an earthquake of magnitude 6 or greater on the Richter scale in the near future, and 80 percent of the building stock in St. Louis is unreinforced masonry, which could crumble in a quake.
In fact, a panel of the National Research Council cautioned earlier this year that Boston, Seattle, Memphis and other urban areas in at least 39 states face significant risk.
The contrast between what occurred in the Bay Area and the earthquake of similar magnitude in Soviet Armenia last year illustrates the danger that many of these other U.S. cities face. Most would experience far fewer than the estimated 25,000 deaths that occurred in Armenia, because building codes and construction techniques are different from those in the Soviet Union. Nevertheless, without adequate preparation, the devastation from a major quake in many U.S. cities could be severe.
The problem is not only earthquakes, but natural disasters generally. Hurricane Hugo, Hurricane Jerry and the Bay Area earthquake followed on the heels of wildfires in the West, flash floods and debris flows in Hawaii and an earthquake in southern California during the past two years. Nine hundred tornadoes strike our country each year, and more than $1 billion in damage is caused annually by landslides.
Although we cannot prevent these disasters from occurring, we could be doing much more as a nation to prepare for them and minimize their impact on life and property.
A group of leading scientists, engineers and others reported recently to the Research Council that the United States is failing to take full advantage of a growing body of knowledge about hazards. Our country's current approach can be characterized as a patchwork of temporary fixes, incomplete analyses of alternatives and uncoordinated actions and policies. In engineering designs, investment decisions, and public and private policymaking, many efforts—such as building on landfill in San Francisco or on barrier islands in South Carolina in the face of known hazards—increase potential losses rather than reduce them. Proven technologies are not applied, whether out of a false sense of economy or inertia. The United States also spends much less than many other countries on research and development in this field.
Certainly, many improvements have been implemented. One look at the survival of San Francisco's skyline or at the early warning system in Charleston shows that substantial progress has been made and that many dedicated Americans are working hard to save lives. Yet the same two disasters also revealed that much more remains to be accomplished.
Ironically, the earthquake and hurricane struck at a time when the United States has a unique opportunity to expand its efforts in hazard mitigation, both at home and internationally. This past week, the United Nations acted to implement a worldwide program that designates the 1990s as the "International Decade for Natural Disaster Reduction." The program will encourage scientists, engineers, public officials, urban planners and others from around the world to share research and information on hazard-reduction techniques.
For Americans, this need not require losing perspective about the tradeoffs involved in cost, land use, convenience and the like. Much can be gained from relatively simple measures, such as encouraging homeowners in storm-prone regions to secure their roofs, controlling erosion to prevent landslides and preparing better relief plans. More expensive measures, such as reinforcing older highway structures like the one that collapsed in Oakland, may also be worth the cost.
The most important need, however, is to overcome complacency about the situation. Both Hurricane Hugo and the California earthquake show that Americans must not be fatalistic about disasters. We must seize the historic opportunity before us to make these events much less disastrous.
November 2, 1989
Richard E. Hallgren, executive director of the American Meteorological Society, chairs the U.S. National Committee for the Decade for Natural Disaster Reduction.
* * *
Our $1.5 Trillion Investment
Robert F. Jortberg
Imagine a national asset whose value dwarfs the cost of the Clean Air Bill, child care, fighting drugs and other initiatives combined.
Such an asset exists, but it has the misfortune of falling under the heading of "infrastructure." I would wager that most people who read that word immediately think of bridges or potholes. Either that, or they turn to the comics page.
Infrastructure involves more than transportation, however, and it is not mundane. A significant piece of it is as close to our daily lives as our neighborhood school, hospital and post office. These and other public buildings are worth approximately $1.5 trillion. Many of them are quietly falling apart.
The parking garage atop the New Haven Coliseum corroded so much that it now must be demolished and rebuilt after less than 20 years. One urban school district has a maintenance budget so low that its classrooms will be painted only once every 100 years. Buildings at hospitals, universities and other public institutions are saddled with everything from leaky roofs to severe structural problems.
A family severely in debt would not let its car run without oil simply because it cannot afford a new one. Yet our country, with its severe budget deficit, is in a similar position: it is failing to maintain and repair its buildings. Department of Defense buildings alone are worth more than $500 billion, and the nation's 88,021 public school buildings are close behind.
I chaired a National Research Council committee that recently examined this situation, and we were troubled by what we found. Many officials are failing to protect public buildings, and the potential costs of correcting the situation already total billions of dollars. Poorly maintained buildings not only threaten the health and safety of occupants; they also are terribly demoralizing for the teachers and others who must work amid broken lights and heating systems.
The problem is not that officials deliberately sabotage schools, prisons, fire stations, recreation centers and other buildings. Most of them are hard-working people trying to serve a public that wants more services without new taxes. Yet when faced with a choice between maintenance or providing more visible services such as snow removal or police protection, many officials choose the latter.
Similarly, it is inevitable that politicians will reap more glory for breaking ground at a new building than for modernizing the ventilation system at an existing one. Only if that ventilation system becomes a breeding ground for Legionnaires' disease, or if some other calamity occurs, will they suffer. More likely, the consequences will become apparent only after the politician leaves office.
These failings are understandable, but the fact remains that public officials are supposed to be stewards of public assets. In cases where political expediency motivates the decision, neglect of building maintenance is nothing less than a squandering of public funds.
What can be done to improve the situation? The answer does not lie in castigating public officials but in reforming the process that leads them to make the decisions we now see on the federal, state and local levels. Our committee offered two specific suggestions.
First, officials at all levels of government need more and better information. Too many city managers and state public works directors are trying to allocate funds without a clear understanding of what's needed. At a minimum, they should assess all buildings regularly in a way that yields useful information. Techniques exist to accomplish this, and officials ought to start using them.
Second, budgets should reflect the fact that adequate maintenance and repair are essential parts of the overall cost of owning public buildings. An amount equal to about 2 percent to 4 percent of buildings' replacement value needs to be dedicated to this function each year. Officials should not be allowed to "borrow" these funds for other purposes.
The larger need, of course, is money. Increased spending on maintenance would do wonders and, in the long run, actually save funds by eliminating the need for many repair or replacement projects. These are our buildings, all $1.5 trillion worth of them. If we fail to protect them, we'll just end up spending more to rebuild them.
July 15, 1990
Robert F. Jortberg is a retired Navy rear admiral.
* * *
Tough Choices About Rising Sea Level
Robert G. Dean
The world's sea level is slowly rising. Does this mean you should avoid investing in a beach condominium? Should coastal communities be planning to retreat from the oceans? Should they, instead, try holding off the waters with dikes, storm gates and pumps? Or is all this a lot of paranoia over nothing?
One thing is sure: Rising sea level and coastal deterioration are on the minds of a lot of Americans. Time magazine recently published a cover story entitled "Shrinking Shores" that was filled with grim tales of families in coastal states watching their dream homes sink into the ocean. Communities from Maine to Maui are worried about the problem.
An expert committee of the National Research Council, which I chaired, recently tried to set the facts straight about this growing issue. We released a comprehensive study of what sea level rise means for coastal planners and engineers. Our committee considered a range of possible increases in sea level over the next century—from about 20 inches to nearly five feet—and pondered their environmental and policy implications.
Our basic conclusion was that there is no reason for panic. Boston, Miami and San Francisco are not about to slip into the ocean. On the other hand, the evidence is convincing that sea level is rising and that many coastal communities lack the data they need to make realistic decisions about coping with the situation in the decades ahead.
Most scientists agree that increased levels of carbon dioxide in the atmosphere, caused in part by our modern appetite for fossil fuels and forest products, are creating a "greenhouse effect" that will raise world temperatures by three to eight degrees Fahrenheit over the next century. Scientists disagree, however, over the extent to which this will cause the world's glaciers to melt and ocean water to expand, bringing about a rise in sea level.
Changes in sea level are measured in two ways: the relative change between the land and the sea in a given location and the absolute change worldwide. During the past century, relative sea level has risen by about 12 inches along the Atlantic coast, six inches along the Gulf of Mexico and four inches along the Pacific coast. Much of this increase has been due to land subsidence, or sinking, caused by the extraction of underground water or hydrocarbons, by earthquakes and soil compaction, and by other factors. In the Louisiana delta, for example, these factors and wetlands destruction are responsible for the sea's moving tens of feet inland every year.
At the same time, relative sea level has been falling in Alaska and other far northern locations as glaciers melt and the land is relieved of their tremendous weight.
Most shoreline communities in the United States do need to start preparing to deal with a higher sea level over the next century. They have three options: armor the shoreline, nourish it with dredged sand or retreat to higher ground.
None of the options is cheap. Armoring a shoreline with sea walls and other devices or nourishing it with dredged sand can cost as much as $800 per foot of shore front. In the case of beach resorts or port facilities, of course, the existing investment may be so great that these expenditures are warranted. Elsewhere, retreat may be the best choice. North Carolina, with its environmentally fragile barrier islands, for example, has already established an erosion buffer zone where new construction is prohibited. Other states are doing the same.
Fortunately, most beach cottages and other small buildings with short life expectancies are not threatened immediately by rising sea level—although the land on which they are built may eventually wash away. Similarly, most industrial facilities are renovated often enough that rising sea level can be accommodated. Here, too, it depends on local conditions and other factors.
A greater problem is the fate of coastal bridges, airport runways, power plants and other major long-term facilities. These expensive installations may have to be protected from rising waters or redesigned to handle the new conditions. Also of special concern are the intrusion of salt water into underground aquifers and the impact of rising sea level on coastlines that have special environmental, as opposed to economic, value.
Again, it is important to stress that global sea level is rising slowly enough to prepare for its effects. But the situation clearly demands increased vigilance and research rather than complacency. Specifically, much more intensive monitoring and research programs are needed to help coastal communities