
2—
Information Technology Trends Relevant to Crisis Management

In one session of the workshop, panelists were asked to project how trends in information technology research might affect crisis management. They discussed both the current state of technologies and future developments, and as part of their analysis identified broad areas of potential growth where new information technology research would have a significant impact. In the area of computing and storage, Paul Smith discussed trends in high-performance computing, including supercomputer speeds and very large storage devices. Barry Leiner discussed software issues in finding, integrating, and sharing the enormous amount of information available in current and future information networks. In the related area of databases, David Maier outlined trends in the development of database systems to support complex applications and data types. In the area of wireless communications, Philip Karn gave an overview of the development of cellular, digital, and satellite communications devices for voice and packet data. Finally, Daniel Siewiorek described the rapid development of computers designed to be worn in the field and touched on what has been learned about how people interact with these devices.

Computing and Storage

Paul Smith, from the U.S. Department of Energy's (DOE's) office responsible for the safety, security, and reliability of the nation's nuclear weapons stockpile, discussed high-performance computing trends. These were recently explored through a series of workshops conducted by DOE on data and visualization corridors—pathways through which scientific data can be exchanged and users can work collaboratively.1 Smith started by noting that visualizations based on large simulations, information in large scientific databases, and real-time observations, all of which depend on high-performance computing, are a valuable tool that would have many applications in crisis management.

One component of the DOE-sponsored workshops was the development of a time line (1999 through 2004) for various computing performance parameters. For example, the computing speed of the fastest machines is expected to increase by a factor of more than 30 by 2004.2 The size of a typical data query (with a constant transfer time) is projected to grow from 30 terabytes at the upper limit now to 100 terabytes and even 1 petabyte (1 million gigabytes, the equivalent of 10^9 books or almost 2 million audio compact disks), directed to archives that are 100 petabytes in size.

Different applications require different balances of cycle speed versus memory to achieve a specific result within a certain time, so systems must advance in a balanced manner. The balancing process also must take into account storage capabilities, speed of data access, and network speed. Research is under way on storage technology to try to reduce the cost and footprint of petabyte storage systems. In general, the development of storage technology is well financed, and the market is driving advances targeted at the low end, such as personal computers, as well as the high end, such as large-scale business and corporate systems. However, at the very high end (e.g., intelligence and space systems), special efforts will be required to achieve storage increases. According to Smith, advances in storage technologies may be impeded by problems with device reliability, the physical transparency of devices to end users, the security of network data, and resource control and management.

1. See Paul H. Smith and John van Rosendale, eds. 1998. Data and Visualization Corridors: Report on the 1998 DVC Workshop Series. Technical Report, California Institute of Technology, Pasadena, California.

2. Microprocessor performance has increased rapidly throughout the last decade and has become equivalent in many cases to that of large machines. This trend has enabled supercomputers to progress very rapidly, with sustained performance projected to reach 1 petaflop by 2007. A number of forces are driving these advances. One is progress in electronics, particularly the technologies to which Moore's law applies, and the associated decrease in feature size on microprocessors. Sustaining this progress will require new fabrication technologies: the industry currently relies on optical lithography, for example, but in a few years it is expected to have to convert to a different process, such as X-ray or electron beam lithography.
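The storage equivalences quoted above can be checked with a little arithmetic. The per-item sizes used below (roughly 1 MB of text per book, 650 MB per audio compact disk) are illustrative assumptions, not figures from the workshop.

```python
# Rough check of the petabyte equivalences quoted in the text.
# The ~1 MB/book and 650 MB/CD figures are illustrative assumptions.
GB = 10**9                      # decimal units
PB = 10**6 * GB                 # 1 petabyte = 1 million gigabytes

books = PB // (1 * 10**6)       # assuming ~1 MB of text per book
cds = PB / (650 * 10**6)        # assuming a 650 MB audio CD

print(f"1 PB ~ {books:.0e} books")
print(f"1 PB ~ {cds / 1e6:.1f} million CDs")
```

With these assumptions a petabyte works out to about 10^9 books and roughly 1.5 million CDs, consistent with the "almost 2 million" figure in the text.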

In its high-performance computing program, DOE is pursuing a strategy of leveraging commercial building blocks, combining many processors and linking together small storage devices of a size driven by the commercial marketplace to make larger, scalable systems. Challenges associated with this strategy include how to design the information management or data management software needed to exploit these capabilities.

Information Management

Barry Leiner of the Corporation for National Research Initiatives discussed software issues in finding, integrating, and sharing the enormous amount of information available in current and future information networks, as well as issues related to availability. These are issues that the Defense Advanced Research Projects Agency (DARPA) has been exploring and for which it has established a useful framework for thinking about trends in information management (Table 2.1).

TABLE 2.1 Trends in Capabilities for Information Management

Capability                Current Level       Goal
Federated repositories    Tens (custom)       Thousands (generic)
Items per repository      Thousands           Millions
Size of "large" item      1 MB                100 MB
Typical response times    10 s                100 ms
Mode                      Play and display    Correlate and manipulate
Interoperability          Syntactic           Semantic
Filters                   Bibliographic       Contextual
Language                  Multilingual        Translingual
Context                   Forms and tags      Semistructured

SOURCE: Information Technology Office, Defense Advanced Research Projects Agency. 1999. Information Management Program Goals. Department of Defense, Washington, D.C. Available online at <http://www.darpa.mil/ito/research/im/goals.html>.

The following elements are necessary to provide information that enables workers to perform their jobs well, whether in a collaborative or individual setting:3

3. The discussion here is adapted in part from Information Technology Office, Defense Advanced Research Projects Agency (DARPA). Information Management Program Goals. Information Technology Office, DARPA, Department of Defense, Washington, D.C. Available online at <http://www.darpa.mil/ito/research/im/goals.html>.

• Robust infrastructure. A crisis can threaten the integrity and performance of critical information infrastructure. How can the infrastructure be better protected? How can it be designed to degrade gracefully when under stress?

• Information search and retrieval. Decision-making processes in all sectors rely on enormous amounts of information, which is continually being augmented and updated. How can users effectively query diverse information sources? How can they effectively manage the information they receive in order to support their activities?

• Compatibility of formats. Shared information can be represented in a diverse range of formats, which vary in syntactic structure, in the extent of meaning captured in the representation (e.g., an HTML table vs. a table in a relational database), and in nuances of meaning within categories (e.g., the various bibliographic representations in use in libraries). How can diverse formats be reconciled and managed?

• Building of knowledge-sharing organizations. The ready availability of electronic information reduces barriers to communication, information sharing, and collaboration. What are the possible effects on how people and organizations carry out their business?4

The issue of infrastructure availability is a significant challenge in crisis management. One approach is to design systems that can degrade gracefully. Another is to design the system at a level that can be assured, which is what the Department of Defense (DOD) traditionally has done in building its own systems. A drawback of the latter approach is that DOD is less able to exploit advances in civilian technology. Neither approach seems optimal.5 One alternative being explored would selectively provide key information to areas that incur great damage. For example, when communications are degraded, only relatively recent information would be transmitted to affected areas, with "prepositioned" information resources providing the rest of the information needed. This approach requires knowing in advance what a user will require. Unlike disasters such as hurricanes, for which officials have an advance idea of where the storm might hit and what information must be available, many other sorts of crises are less predictable and can occur anywhere in the world at any time. An approach that depends on a preplanned distribution strategy cannot meet the requirements of such contingencies.

Beyond the challenge of taking the tremendous amount of information on the Web and making it accessible to crisis management teams when they need it is the goal of making this information, as well as knowledge representations, available in a way that supports collaboration by ad hoc teams assembled rapidly in a crisis.

4. This issue is one of those proposed in the Administration's Information Technology for the Twenty-First Century (IT2) initiative under the heading "Social, Economic, and Workforce Implications of Information Technology and Information Technology Development."

5. Commenting on the need for collaboration in information management technology, Leiner cited the tremendous synergy between civilian and military requirements for crisis management. Although the military requirements are more stringent because of the need to be able to react anywhere, at any time, there is civilian-sector technology that lends itself to meeting those requirements (e.g., laptops provide portable computing in the field, and commercial satellites provide suitable communications capabilities for many circumstances).

Databases

David Maier of the Oregon Graduate Institute discussed several trends in the development of database systems to support more complex applications and data types.

• Support for application logic. Databases are increasingly managing not only the data but also the application logic, which consists of instructions on how to manipulate the data. This trend began in the mid-1980s, when stored procedures and object databases began appearing on the market. Support for application logic then emerged in both database engines and affiliated tools. An example of the former is database engines storing multimedia types; an example of the latter is tools that convert data into HTML—the language used to represent Web pages—to support user interfaces. The trend toward integration of application logic was successful for several reasons. One is the ability to mask heterogeneity: in a large enterprise using many different types of machines, an application that can be written using only database services is more easily moved than one that depends on platform-specific services such as a file system. A second reason is manageability.
Applications are changing rapidly and acquiring new functions, so help from a database system is useful. The database can help deploy, configure, and manage applications that use data and can help recover both the data and the application after something goes wrong. Finally, incorporating application logic into databases helps provide scalability in applications, which have become quite complicated, require access to distributed data, and must support large numbers of users. For example, transactions can be initiated with a store or airline without any human intermediary, and so the availability of sales representatives no longer limits the number of users that can access the database at once.
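The idea of application logic living inside the database engine can be made concrete with a small sketch. Python's built-in sqlite3 module lets an application register a function that the engine then calls from within SQL queries; the severity-classification rule below is a made-up example for illustration, not one discussed at the workshop.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reports (site TEXT, severity REAL)")
conn.executemany("INSERT INTO reports VALUES (?, ?)",
                 [("north", 0.2), ("south", 0.9)])

# Application logic registered with the engine: every client that
# issues SQL against this database shares the same rule.
def classify(severity):
    return "critical" if severity >= 0.8 else "routine"

conn.create_function("classify", 1, classify)

rows = list(conn.execute(
    "SELECT site, classify(severity) FROM reports ORDER BY site"))
print(rows)   # [('north', 'routine'), ('south', 'critical')]
```

A stored procedure in a commercial engine plays the same role; the point is that the rule travels with the data rather than being duplicated in every client program.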

• Data type extensibility. The ability to add new data types gives a database additional information about an application. Thus, rather than simply identifying an image inserted in a database as a large, untyped sequence of bits, the database understands the type of image and how it can be manipulated. As a result, the user can search and manipulate complex types directly in the database system rather than in an external application program, reducing application complexity and improving the consistency of the data in the database.6

• Data warehousing. Database developers are realizing that users want their products to support complex decision support queries, spanning multiple data sources, on the same systems that process online transactions. At one time, it was believed that relational databases would enable users to run such queries directly. In fact, systems optimized for transaction throughput do not support efficient analytical queries, and vice versa. Today, because data can be duplicated at an affordable price, a separate copy of the data can be kept in a database system organized for efficient support of decision support queries. In addition, many tools are available for moving operational data into a warehouse, extracting and cleaning them, and loading them in parallel. The warehouses hold much more data than do operational transaction-processing systems, often terabytes of information, and database languages and query processors have extensions for efficient data analysis. For example, they could analyze all sales for a large retailer such as Wal-Mart and display them by store, by department, and by quarter. Such tasks frequently involve analyzing hundreds of millions to billions of records.7

• Development of application servers. To support applications with many users, a middle tier is evolving between the database and the desktop. This application server acts as an intermediary between clients and back-end databases. The client portion of an application might simply be a form in a Web browser that captures some information about what data and operations are needed. The application server determines what back-end database(s) to contact and performs the computationally expensive parts of operations. Maier said that database companies are starting to figure out how their products can make this middle layer easier to construct and manage. A benefit of this approach is that, rather than trying to update 10,000 clients with a new application (including worrying about providing and controlling remote access to each), one could simply update 10 application servers with new logic.

6. In an object database, extensions involve adding new classes of data. In relational databases, pluggable modules called extenders or cartridges are added.

7. This approach does not necessarily apply to database management in crisis management applications. A well-managed company such as Wal-Mart can be in control of all its data and can make at least the formats consistent. The ad hoc composition that characterizes much of crisis management information processing is not centrally managed, owing to the large number of independent organizations involved, and can present huge challenges for analysis.

Amid all these advances, databases continue to have limitations. One is the disk-centric focus of current database system products. For example, some people still argue that a large enterprise should not deploy servers in all the locations where it conducts business but should instead have one large server to which each business site is connected. In other words, the focus is still on data storage rather than on data movement, which Maier pointed to as a key to the future of database technology. Database systems should involve data staging and movement, rather than just holding data in readiness for future queries.

Another current limitation of databases is that they do not handle unexpected types of data well—a formal structure known as a schema must first be defined. That is, if a user uncovers some interesting information of a new type and wants to preserve it and its structure for manipulation and delivery later, the current generation of database system products generally cannot readily accommodate the new information. For database systems to expand in scope, this "schema first" requirement must be relaxed.

Wireless Communications

Philip Karn of Qualcomm discussed past, current, and future trends in wireless communications, which have been driven by a combination of increased demands for end-to-end performance and the need to achieve greater efficiency in use of the finite radio spectrum.8 In the mid- to late 1970s, analog two-way radio systems were commonplace.
Analog technology continues to be used in combination with sophisticated control systems and is the workhorse for two-way public service and emergency communications. Also at that time, DARPA began funding a substantial amount of research in packet radio. The concept was that packet radio networks could be dropped into remote areas to fill gaps in existing systems. Much of that early research is now starting to bear fruit in operational systems, Karn said.

In the early to mid-1980s, advanced mobile phone service (AMPS), which uses traditional analog voice modulation, was developed and deployed. The major innovation was its use of digital control channels, so that calls could be switched automatically from one cell site to another, allowing the user to treat an AMPS cellular telephone in much the same manner as a wireline telephone.

In the late 1980s, demand for cellular telephone service increased, and Qualcomm started trying to apply well-established spread-spectrum techniques to improve the efficiency of cellular telephony. In the early 1990s, the company launched tests of code division multiple access (CDMA), which is based on the spread-spectrum technologies used in the military. At that time there were a number of competing digital systems. First launched commercially in Hong Kong in 1995, CDMA is now in limited use in North America, Asia, and Eastern Europe. Two schemes based on time division multiple access (TDMA), GSM and IS-54, operate according to similar principles but are not compatible with each other. By the mid-1990s, digital cellular systems were widely deployed. GSM is used primarily in Europe but also in Japan and the United States. IS-54 is also used in the United States and elsewhere in North America.

Similar underlying technologies, particularly high-speed digital signal processing, video compression, and audio compression, are used in the direct broadcasting satellite business, which is among the most rapidly developing consumer technologies. Low-Earth-orbit satellite networks are close to commercial operation and, if successful, will provide access to disaster-stricken remote areas where there is no cellular coverage.

8. For an extended discussion of the history of wireless communications development, see Computer Science and Telecommunications Board (CSTB), National Research Council. 1997. The Evolution of Untethered Communications. National Academy Press, Washington, D.C.
The prices are relatively low compared to those for today's satellite systems but are high enough that competition with terrestrial systems will be difficult. Therefore, many see these satellite services primarily as a way of filling in the gaps in terrestrial cellular coverage in remote areas.

Another interesting development is Part 15 ad hoc networks. Part 15 of the Federal Communications Commission (FCC) rules applies to low-power unlicensed devices: certain segments of radio spectrum are set aside for use by low-power devices that meet a relatively simple set of technical requirements. Metricom's Ricochet modems are an example of a Part 15 ad hoc network that employs a mesh network topology.

Efforts are also finally under way to set wireless standards for the next generation of wireless telephony, which, given the multitude of possible design choices in digital systems, is important. This is a significant issue for emergency communications because interoperability problems inhibit rapid network deployment. Historically, the wireless industry has been characterized by proprietary protocols, and getting truly interoperable standards has been difficult, except when they are championed by large companies that are still licensing the technology.

Advances in digital wireless have been enabled by four important technologies. Spread-spectrum technologies simplify spectrum management and can enhance privacy. Because the industry is close to the theoretical channel capacity limits established by Claude Shannon in the 1940s, low-bit-rate voice coding is increasingly important. Error-control coding is another enabling technology that maximizes system capacity. In addition, application-specific integrated circuits have been crucial to making these systems work efficiently at low power. Further increases in system capacity will come at high cost: companies could deploy more and smaller cells, use directional antennas, or implement more flexible channel management strategies.

A recent FCC mandate to improve capabilities for pinpointing the positions of cellular telephones when they are used to report emergencies has direct implications for crisis management. Existing technologies can identify only the cell in which the caller is located.

Particularly relevant to crisis management is the provision of data services by wireless carriers. In the early 1990s, carriers developed cellular digital packet data (CDPD), an overlay for the existing AMPS analog network, to provide a basic capability to send Internet Protocol (IP) data packets over cellular frequencies.9 Although CDPD is becoming more widely available, it is still not supported in many rural areas. CDPD systems are also slow, and the wider the area covered, the slower a system will be. Furthermore, CDPD is expensive; charges when the service was first offered were about 15 cents per kilobyte. The low adoption rate was interpreted as indicating low demand for wireless data services. CDPD is now being sold by carriers on a flat-rate basis, and its use is increasing.
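The Shannon limit invoked above has a simple closed form, C = B log2(1 + S/N): capacity grows linearly with bandwidth but only logarithmically with signal power. A quick calculation shows why narrow voice channels bound achievable data rates; the 30 kHz channel width (an AMPS-era figure) and the 20 dB signal-to-noise ratio below are illustrative assumptions.

```python
from math import log2

def capacity_bps(bandwidth_hz, snr_db):
    """Shannon channel capacity, C = B * log2(1 + S/N)."""
    snr = 10 ** (snr_db / 10)        # dB -> linear power ratio
    return bandwidth_hz * log2(1 + snr)

# An AMPS-era 30 kHz voice channel at an assumed 20 dB SNR:
kbps = capacity_bps(30_000, 20) / 1000
print(round(kbps, 1), "kbit/s upper bound")
```

Under these assumptions the hard ceiling is about 200 kbit/s; no modulation scheme, however clever, can exceed it, which is why voice coding and error-control coding matter so much as systems approach the limit.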
The potential exists to provide support for IP packet data in digital cellular services. The existing infrastructure generally does not support this capability, in part because the transition to digital services was managed for fast deployment of voice-only service. This situation is beginning to change. A related trend is the development of new modulation and channel-access schemes designed specifically for packet data instead of voice. For example, Qualcomm's new high data rate technology is somewhat like an asymmetric digital subscriber line technology for cellular systems. Instead of guaranteeing a particular quality of service, these systems perform as well as they can under current conditions, optimizing overall system throughput.

9. Amateur packet radio was developed in the early 1980s in both terrestrial and satellite versions. For many years it has provided support for emergency and disaster communications. Today, as cellular telephones and other commercial systems are meeting most of the operational requirements for disaster communications, the primary role of amateur packet radio has shifted toward technical experimentation and education.

Trends in Wearable Computers

Daniel Siewiorek of Carnegie Mellon University discussed trends in wearable computers. He demonstrated an early-generation wearable computer, designed in about 1994, that supported a marine performing a 600-element inspection of an amphibious tractor. This system, which employed a head-mounted display to replace a clipboard, was awkward to use in many situations. It did not use voice input, which might be overheard by an enemy, but relied instead on a keypad interface. Field studies showed that the wearable computer saved 70 percent of the time needed to perform an inspection and enter the data into a logistics computer that would then generate work orders for mechanics.

To indicate the possible roles of wearable computers, an analogy between computing and electric motors is useful. About 100 years ago, big dynamos produced energy, and people brought their work (e.g., drill presses) to the dynamos. Later, the fractional-horsepower motor was invented; it could be incorporated into an individual drill press and moved out into small job shops.10 That change was analogous to the transition from mainframe to desktop computing. Today, a car may have 50 electric motors, which pop the gas tank lid, run the windshield wipers, lock the doors, and so on. Their function is transparent to the user; there is no need for a 500-page user's manual to unlock a car. Wearable computers are likely to follow analogous trends toward pervasive deployment of computing devices. One forecast is that a user might have five IP addresses assigned to his or her body.
10. A historical analysis of how this change in organizational practice—the shift to using individual motors—was instrumental in realizing significant gains in manufacturing productivity is given in Paul A. David. 1990. "The Dynamo and the Computer: An Historical Perspective on the Modern Productivity Paradox." American Economic Review 80(2):355–361.

As electronics become faster, smaller, and more portable, human factors issues are becoming more important, because it is not yet known how humans will interact with wearable technology. A considerable amount of experimentation is under way in this area. For example, researchers at Carnegie Mellon University have built 16 generations of wearable computers over the past 8 years and have learned much about critical factors affecting wearability, such as placement on the body. Placement at some regions of the body may be more favorable because a device will move less as a person goes through the motions of a task. On the other hand, the degree to which device weight and thickness affect task performance and comfort can vary with body location. Body heat and device heat conduction also can affect wearer comfort significantly. A wearable device can act as a vapor barrier, affecting the comfort of a wearer working on an airplane in a hot environment. Intel Corporation discovered that a person's lap is more sensitive to dissipated heat than the fingers are. Laptop computers are now designed to dissipate heat without making the user feel uncomfortable, for example, by dumping heat through the keyboard.

Researchers have found that users tend to have high expectations for wearable devices. The user of a wearable computer is much less patient than one using a desktop model, expects an instant response to inputs, and wants the computer to be as easy to use as a flashlight. The demand is for a device that a user can simply turn on and operate, without recourse to a user's manual.

Siewiorek also noted potential hazards in the use of this technology. Given too much information, the user may focus too heavily on the computer and lose touch with the physical world. Users may also lose initiative, doing only what the computer tells them to do. Interaction design is thus a significant issue.

Applying Moore's law to the computing power needed to support human interfaces, one can predict the performance and styles of interfaces that will become feasible. In the early 1980s, computers could perform about 1 million instructions per second (MIPS), enough to support a textual alphanumeric interface with a keyboard. Graphical user interfaces with a mouse and icons became supportable when processor speeds reached 10 MIPS.
Handwriting recognition systems require 30 MIPS; speech recognition systems, about 100 MIPS. The next wave of interfaces—speech synthesis output, multimedia data types, three-dimensional gesturing, position sensing, and stereo visual and audio output—may take 5 to 10 years to mature as suitable data representations are developed.

Energy is a key factor driving wearable computer technology. Indeed, more than half of the weight of today's wearable devices is in batteries. Projections show that it is possible to reduce energy use by an order of magnitude, but as this is done, the fraction of the total energy used by the various system components shifts. For example, as computing becomes more efficient, the radio uses a much greater proportion of system power, and the energy needed to transmit data becomes a greater factor.
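The shifting power budget described above is easy to see numerically: even if the radio's own draw never changes, its share of the total grows as the compute side becomes more efficient. The watt figures below are assumed numbers for illustration, not measurements from the workshop.

```python
# Illustrative power budget (all watts are assumed values): the
# radio's draw is held fixed while compute efficiency improves by
# an order of magnitude per generation, so the radio's share of
# total system power climbs steeply.
compute_w, radio_w = 4.0, 1.0
for gen in range(3):
    total = compute_w + radio_w
    print(f"gen {gen}: total {total:.2f} W, radio {radio_w / total:.0%}")
    compute_w /= 10          # order-of-magnitude efficiency gain
```

Under these assumptions the radio goes from a fifth of the budget to nearly all of it within two generations, which is why transmission energy becomes the dominant design constraint.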

The type of information being sent has a major impact on energy needs. A National Research Council study looked at how much battery weight is needed to acquire and transmit a particular piece of information.11 For example, it is estimated that about 1/100th of a gram of battery weight is needed to perform speech recognition on voice input and transmit the information as text, and about 1/2,000th of a gram to digitize and transmit voice as audio. But compression and transmission of video would require about 10 grams of batteries. The battery weight required to distribute real-time data with full-color video among mobile users in the battlefield would be quite large.

11. National Research Council. 1997. Energy-Efficient Technologies for the Dismounted Soldier. National Academy Press, Washington, D.C.
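The battery-weight figures above imply striking ratios between media types, which a few lines of arithmetic make explicit (the task labels are paraphrases of the examples in the text).

```python
# Grams of battery per item of information, from the figures above.
weights_g = {
    "speech recognized on-device, sent as text": 1 / 100,
    "voice digitized and sent as audio": 1 / 2000,
    "video compressed and transmitted": 10.0,
}

audio = weights_g["voice digitized and sent as audio"]
for task, grams in weights_g.items():
    print(f"{task}: {grams:g} g ({grams / audio:,.0f}x audio)")
```

Sending recognized text costs about 20 times as much battery as sending the raw audio, and video costs some 20,000 times as much, which is the point of the battlefield example above.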