V. USER INVOLVEMENT

The fundamental requirement for an information system is to support user needs. The complete information system, consisting of instruments, data systems on the satellite, data downlinks, and processing on the ground, must acquire, manage, and distribute the data. It will take a massive effort to have a data system in place for systems such as the Earth Observing System (EOS) in 1995. It is difficult to scale the overall effort very well until the systems planning and component analyses are more advanced. However, it is clear that OSSA will need to apply considerable information systems resources to complete the task.

OSSA has shown over a lengthy period that it values the viewpoints of its information systems users. It was at OSSA's request, for example, that the Space Science Board (SSB) of the National Research Council's Commission on Physical Sciences, Mathematics, and Resources established the Committee on Data Management and Computation (CODMAC) in 1978. The CODMAC's continuing charge is to examine the management of existing and future data acquired from spacecraft and associated computations in the space and earth sciences, and to make recommendations for improvements from the perspective of the scientific user.

In its 1982 report,* CODMAC defined a set of principles and recommended that those principles "become the foundation for the management of scientific data." The first of the CODMAC principles reads as follows:

"Scientific Involvement. There should be active involvement of scientists from inception to completion of space missions, projects, and programs in order to assure production of, and access to, high-quality data sets. Scientists should be involved in planning, acquisition, processing, and archiving of data. Such involvement will maximize the science return on both science-oriented and application-oriented missions and improve the quality of applications data for application users."

* Data Management and Computation, Volume 1: Issues and Recommendations; Committee on Data Management and Computation, Space Science Board, NRC; National Academy Press, Washington, D.C.; 1982.

In its second report,* CODMAC notes that progress had been made on the recommendations in its first report. One example is the initiation by the ISO of the pilot data systems, which directly involved the science community. However, CODMAC also noted its concern that neither NASA nor the space science community seem to be postured "to efficiently implement geographically distributed information systems..." It observed that NASA and the science community, with strong leadership from NASA, "need to work together to achieve the common goal -- to maximize the scientific return from space science data." To this end, CODMAC recommended that NASA establish a high-level advisory group, consisting of experienced data users (scientists) and experts in the relevant technologies, to advise senior NASA officials "on matters of data policy (and) computation and data management practices."

* Issues and Recommendations Associated with Distributed Computation and Data Management Systems for the Space Sciences; Committee on Data Management and Computation, Space Science Board, NRC; National Academy Press, Washington, D.C.; 1986.

It is difficult to know just how far to go in promoting user involvement in the entire information systems process. Clearly, OSSA has a tremendous experience base, especially that part embodied by the users, for the development of its information systems. The question raised by CODMAC (and by several user-oriented members of this Committee) is whether OSSA is taking full advantage of that which is available to it. At the same time, the Committee recognizes that OSSA must maintain control over its systems and related programs, plans, operations, and management processes. Therefore, the following is considered to be an issue that requires further study:

Issue #3. To what extent should users be involved in the development of and changes to information systems, while still maintaining OSSA control?

Users of Space-Derived Science Data. According to the second CODMAC report, there are two types of users of space-derived science data:

-- Primary users are the principal investigators (PIs) who develop instrumentation, their co-investigators, and researchers and students who work directly with the PIs. The term "primary users" also applies to members of research teams who obtain data from a remote sensing instrument. The primary users, in general, receive data from their instrument directly from a mission data system.

-- Secondary users are scientists actively engaged in research in a given discipline, but who are not directly associated with a given instrument. Secondary users may or may not be directly associated with a particular mission. Secondary users usually receive data from an archive.

Also:

Primary users become secondary users when they want to use data from an instrument other than the one with which they are associated. Another example of secondary users is scientists associated with one spacecraft who wish to do correlative studies by using data from another spacecraft. In fact, all users of correlative data are secondary users.* In most cases, guest investigators are considered as secondary users. Most commercial users of space-derived data are secondary users.

* Correlative data are data that are processed in a standard way, are distributed to all interested scientists, and are used to interpret data from spacecraft. An example of this is ground magnetic data used for correlative studies in solar-terrestrial physics.

EOS Data Users. According to the EOS Data Panel, there will be at least four types of major users of that system:**

1. Instrument team members and support personnel associated with EOS instrument or mission operations centers. They will need to monitor a sampling of data continually in near-real time for quality assurance, error detection, and instrument malfunction assessment. They should have the capability to reconfigure observational sequences when malfunctions or special events occur.

2. Researchers, instrument team members, or operations-oriented personnel who need instrument-specific, near-real time, or real-time data processing, delivery, and display capabilities. Some of these, such as the National Oceanographic and Atmospheric Administration (NOAA) and the Department of Defense (DoD), may require large data volumes.

3. Researchers who will need to interrogate directories and catalogs of EOS and other relevant data on an instrument, geographic location, and time of acquisition basis. They will need to order data, and in some cases they will request particular observational sequences from EOS instruments.

4. Other researchers also will need to interrogate EOS and non-EOS data directories and catalogs, but they are distinguished from the previous group by the need to browse EOS data visually through attributes or expert systems to find particular features, attributes, or special cases.

** Report of the Eos Data Panel (Robert R. P. Chase, et al.), NASA Technical Memorandum 87777, Volume IIa, 1986, pp. v-vi.

OSSA-User Interaction. During this study, the Committee reviewed background material and documentation pertaining to several missions planned for the 1990s involving astronomy and astrophysics, planetary sciences, solar terrestrial physics, atmospheric sciences, and land resource sciences. The Committee found that all too frequently OSSA involves users early in the information system design phase, but does not maintain a continuing dialogue with them during the development phase. There was a strong impression, even though the Committee was sure it was not intended, that OSSA tends to treat each mission as a new start for information systems development.

The International Solar-Terrestrial Physics program is a case where users have been involved in the planning in an iterative way. In that program members of the science community worked with NASA officials to design the data system at the same time the rest of the project system was designed. NASA officials kept the users informed of the constraints imposed by limited resources. They worked with the scientists to determine the trade-offs involved for the spacecraft, instruments, and data system, and the implications of these trade-offs on the science return from the missions. The result is a data system plan which meets all of the user requirements but is still very modest in scope and cost.

An example in which OSSA does not appear to be interacting effectively with the users is the design of the high resolution imaging spectrometer (HIRIS), which is part of the EOS instrument package. The users requested narrow bands and a wide selection of bands. The first report of over 50 possible bands was met with approval from the user community. Later announcements of 115 channels--and more recently of over 220 channels of data--were met with amazement. The Committee was unable to determine the basis for this planning, but perhaps it is based on other user requests. It seems likely to the Committee that earth resources scientists will ask for 5-7 channels of data at one time, with occasional requests for up to 15 or 20 channels, but it will be rare to see a request for 220 channels. The data rate of instruments will have a large impact on the information system needed to provide the data and information to the user. While the Committee believes the supporting information system should be responsive to the needs of the users, it hopes (in this example) that OSSA will not design an information delivery system based on sampling all of the 220 channels for one scene.
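To give a feel for why the channel count matters so much to the downstream information system, the following back-of-the-envelope sketch compares the raw volume of a single imaging scene at a few channel counts. The scene dimensions, bits per sample, and downlink rate used here are illustrative assumptions, not HIRIS design values.

```python
# Rough scene-volume comparison for a multichannel imaging spectrometer.
# All instrument parameters below are illustrative assumptions, not HIRIS specs.

SCENE_ROWS = 3000          # assumed pixels along track for one scene
SCENE_COLS = 3000          # assumed pixels across track
BITS_PER_SAMPLE = 10       # assumed radiometric quantization
DOWNLINK_BPS = 300e6       # assumed sustained downlink, bits per second

def scene_bits(channels: int) -> float:
    """Raw (uncompressed) bits needed for one scene with the given channel count."""
    return SCENE_ROWS * SCENE_COLS * channels * BITS_PER_SAMPLE

for channels in (7, 20, 115, 220):
    bits = scene_bits(channels)
    print(f"{channels:4d} channels: {bits / 1e9:6.1f} Gbits per scene, "
          f"{bits / DOWNLINK_BPS:6.1f} s of downlink at 300 Mbps")
```

The point is simply that scene volume scales linearly with channel count, so a 220-channel product is roughly 30 times bulkier than the 5-7 channel requests the Committee expects to be typical.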

A number of strong arguments can be developed in support of involving the users in the planning of information systems. The Committee feels the most important of these might be to control costs by determining the limits of the data capability that is provided to the users. There must be sufficient data capability to obtain the appropriate scientific, human-interest, and applied use of the data. This must be balanced against the need within NASA to save money where practical, so that as many valuable missions as possible can be flown. The Committee heard reports emphasizing that it does not make sense to fly a satellite if reasonable use of the data is not funded. The same is true of the data system. At some point costs exceed benefits and a limit to the data system should be defined, at least to the extent possible. Such cost-benefit analyses cannot be rigorously performed in all cases, but the exercise of working with the user community to define appropriate constraints would stand a good chance of providing the information needed for evaluation purposes.

To help decide what level of effort is appropriate, OSSA needs to know who the users are, what uses will be made of the data, and what scale of user support is appropriate for a given data set. Some of the Pilot Data Systems provide more information than the users can absorb. Many critical research data sets are, in fact, not used by large numbers of people. For example:

A popular set of twice-daily, southern-hemisphere atmospheric analyses from Australia covered a 10-year period. Over a 4-year period, copies were sent by the National Center for Atmospheric Research (NCAR) to about 40 people at universities and other research laboratories. Another estimated 20 scientists used it on-line. Since the first rush of research on the data, the use has dropped off to about five requests per year, though a number of people probably are still using their data copies at home.

The most popular Nimbus data set was a set of two or three tapes having ozone samples covering 200 km along orbital tracks, active for several years. NASA mailed out copies of these tapes to 70 users.

Many satellite data sets will be used by a few people (perhaps 1-10) during a several-year period while the main research is going on. Then the data will be relatively dormant while the science waits for periodic new ideas and new questions. For such data, it is mandatory to have data sets that are well-structured, well-documented, and in a catalog. In many cases, it is not useful to spend very much money in advance of the need to develop novel ways to display a particular data set. The data set situation is analogous to that of books in a library. Many essential documents are not used very often.

The National Climate Data Center, at Asheville, North Carolina, deals with data that has a wider interest than the above data. It receives about 50,000 requests a year, mostly for small amounts of printed data.

Most requests can be satisfied for costs of $3 to $15 each. In addition, many publications are distributed by subscription. Requests that demand significant resources are much smaller in number. Only about 1,300 requests each year are for digital data. About 4,000 tape copies are mailed each year. Sometimes these tapes go into other archives, where they are available to even more users. The first archive is then like a wholesaler. Many commercial firms now help to distribute weather data.

The Committee has seen that a given scientific data set may have only 5 to 50 users over a few years. However, the scientists who know most about these data may produce derived data sets that are easier for other people to use. Examples are sea-surface temperature, atmospheric analyses, ice concentration, and pictures. It is these products that usually will be used in interdisciplinary science. Some of the derived figures, summaries, and pictures will go into thousands of copies of textbooks and popular books.

Just as the system throughput must be taken into consideration when designing the supporting information system, so should the limits of the data systems be considered in spacecraft and instrument design. Most scientists recognize the need for trade-offs between the data system costs and research funding, and they are willing to participate in the development of suitable compromises. They have a vested interest in the mission and they have considerable experience and expertise to offer. Such trade-offs and compromises are cheapest if they are worked out during the mission planning stage, rather than later.

Through its briefings from NASA officials, the Committee also learned of several initiatives within OSSA's domain to find common elements in satellite data systems, so that generic systems could be designed for use on such missions. This is eminently sensible, since it can save money and it can lead to an approach that will capitalize on past successes while avoiding the pitfalls of past failures. It appeared to the Committee that such efforts were particularly well-developed at JPL, where common agreement is reached by forming a working group of the appropriate experts from various flight projects and the JPL ISO. If OSSA can expand these initiatives to involve its information systems users without compromising schedule and cost constraints, a fairly rapid solution to this issue might evolve.

Another factor to be considered involves NASA's relationship with operational data users such as NOAA, the U.S. Geologic Survey (USGS), the U.S. Department of Agriculture (USDA), and other government agencies, as well as with commercial users of space data. These communities must also be considered when NASA develops new sensor technologies, since the resulting data and data products will ultimately become their responsibility. The agricultural industry, for example, needs data based on economic trade zones, not just county boundaries, and the petroleum industry has special interests in geologic profiles. These operational requirements need to be included with NASA's science and technology research objectives, to make certain the basic data is available from which new and more useful types of data products can be prepared.

VI. INFORMATION SYSTEMS TECHNOLOGY

Over the past quarter century, OSSA has employed a mission-oriented approach to the collection of data in support of a variety of science applications. During each mission, data were collected to meet the requirements of a small group of scientists in a particular discipline, using a data system that had been developed for that purpose. However, as noted earlier, there is an increasing need for interdisciplinary and multidisciplinary scientific work that will change the way information systems are structured. OSSA knows that as it moves into the Space Station era mission and discipline boundaries will blur, and huge volumes of data will be collected by NASA and others to support a large number of interdisciplinary projects involving hundreds of scientists. OSSA has already initiated comprehensive planning for such missions and for the challenge of information systems that can handle the huge volumes of data and the product requirements of the users.

As an example, the Committee is concerned that even with efficient data-rate management and control, the current digital magnetic recording and compact disk (CD) read-only memory (ROM) technologies cannot cope with anticipated data rates in the Space Station era. Further, commercial database management systems currently do not have the features required to manage large volumes of space-derived data. These technological problems are compounded by such management and operational considerations as the need to control costs (which potentially affects OSSA's ability to support the users) and the need to support the users (which influences costs). Therefore, the Committee suggests the following as the final major issue to be addressed by OSSA in the context of this study:

Issue #4: How can the projected information systems technologies keep pace with future sensor outputs?

After reviewing the technology requirements of NASA information systems in the Space Station era, the Committee believes the specific areas of technological concern are:

-- the trend toward development of higher data-rate instruments for use in remote Earth sensing, taking into consideration the actual user need and constraints imposed by information systems technology and costs (discussed below);

-- the ability of current digital magnetic recording and compact disk (CD) read-only memory (ROM) technologies to cope with anticipated data rates in the Space Station era in support of on-board processing, space-to-ground transmission, "level-zero" processing (that is, data that have been corrected for telemetry errors and decommutated), and the storage and retrieval of data (discussed below);

-- the ability of commercial database management systems to manage large volumes of space-derived data (discussed below);

-- the need for cohesive planning and a unified approach to the creation and control of software (discussed in Section III of this report); and

-- the fragmented and mostly incompatible data transfer and electronic communication between elements of the OSSA and the user community, which makes data and information transfer difficult (discussed in Chapter IV).

The Trend Toward Higher Data-Rate Instruments. Scientists can effectively evaluate only so much data. Most users will want data over a small test site or a sampling of the data to meet their scientific needs. The Committee does not believe information systems should be designed to provide all data acquired by the high data-rate instruments, unless there is an overwhelming scientific justification. A careful cost-to-benefit analysis should be made before designing a data system for the high-data-rate instruments.

Some data sensors have the capability of drowning data systems with so much data that costs become unreasonable and technology may not even be able to cope with the data stream. If all data is saved, one cannot afford to extract the information that is really needed. Several sensors such as the HIRIS and the synthetic aperture radar (SAR) that will be part of EOS dominate the planned data volume environment. Technology improvements should enable NASA to increase the cost effectiveness of handling new data by a factor of ten by 1995. However, it appears that data gathering may increase by much more than the technology gain unless careful plans are made for use of the high-rate sensors.

The data rates for SAR [about 300 megabits per second (Mbps)] and HIRIS (up to 900 Mbps) compare with data rates to good computer disks of 24 Mbps, and Cray supercomputer special-channel speeds of 1,000 Mbps. With data rates even a fraction of these, one must establish a mechanism to cope with questions of what sampling and data archiving make sense. The strategy should include a projection of year-1995 technology and costs, and an effort to drive data storage costs down.
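As a rough illustration of the mismatch between these instrument rates and ground storage, the short sketch below converts sustained sensor rates into daily data volumes and into the number of disk or tape units that would have to absorb the stream. The duty cycle and media capacity are illustrative assumptions, not mission parameters.

```python
# Convert sustained instrument data rates into daily volumes and the number of
# 24-Mbps storage devices needed to keep up. Duty cycle and media figures are
# illustrative assumptions, not mission parameters.

SECONDS_PER_DAY = 86_400
DISK_WRITE_BPS = 24e6        # "good computer disk" rate cited in the text
TAPE_CAPACITY_BITS = 1.0e9   # assumed capacity of one 6,250-BPI tape

sensors = {
    "SAR (~300 Mbps)": 300e6,
    "HIRIS (up to 900 Mbps)": 900e6,
}

DUTY_CYCLE = 0.10            # assume the sensor operates 10% of the day

for name, rate_bps in sensors.items():
    daily_bits = rate_bps * SECONDS_PER_DAY * DUTY_CYCLE
    disks_in_parallel = rate_bps / DISK_WRITE_BPS
    tapes_per_day = daily_bits / TAPE_CAPACITY_BITS
    print(f"{name}: {daily_bits / 1e12:5.1f} Tbits/day at a {DUTY_CYCLE:.0%} duty cycle, "
          f"~{disks_in_parallel:.0f} disks writing in parallel, "
          f"~{tapes_per_day:.0f} tapes/day")
```

Even at a 10 percent duty cycle, either instrument outruns a single 24-Mbps disk by an order of magnitude, which is the Committee's point about needing a sampling and archiving strategy rather than wholesale capture.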

With lower storage costs, it is reasonable to save more data that will be used for local studies and case studies for short time periods. Table 1 and Figure 4 summarize data rates from selected instruments during the next 10 to 15 years (also see Table 2 and Figures 5 through 7 at the end of this chapter).

Another class of studies of growing importance requires processing data over a number of years. If all OSSA does is to save high-volume data for many years, it still cannot be used for such studies because it costs too much. Often data need to be sampled in several ways, such as the one-kilometer (high resolution) and four-kilometer resolution (global survey) data that is routinely supplied to NOAA.

A common satellite data rate of 100 kilobits per second (kbps) produces 3,160 x 10^9 bits per year, or 3,160 high-density tapes [6,250 bits per inch (BPI)] each year. An individual PI usually can cope with only 20 to 100 tapes per year. A data center usually charges $60 to $100 per tape copy, and then it often costs the PI even more to process it.

The International Satellite Cloud Climate Programme is now sampling data from several geosynchronous-orbit satellites and one polar-orbiting satellite to reduce the archive from about 60 x 10^12 bits per year to two archives, one of about 500 tapes (500 x 10^9 bits per year) and the other of about 100 tapes per year. The international processing unit at the Goddard Institute for Space Studies is able to process the smaller of these two archives to derive cloud statistics. While the above data rate of 60 x 10^12 bits per year has posed a difficult problem for long-term studies, it should be noted that the composite data rate being planned by NASA for 1995 is more than 50 times greater (see Figure 7).

In EOS, a NASA division proposed to limit the onboard system to handle aggregate instrument rates not over 20 Mbps. The limit is under debate. Other very-high-rate sensors such as SAR and HIRIS will be handled separately. The Committee thinks this NASA strategy is wise. The high data rates demand more careful attention to decide what sampling strategies and data rates make sense. The main uses for SAR are ocean wave statistics, ice coverage and location, and land resource studies. To obtain ocean waves, one needs only a small, square array of samples located 100 or 200 km apart from each other, perhaps closer together in coastal waters or near a major storm. As indicated above, it seems likely that user requests for HIRIS channels will be rather modest compared with the capability now being planned. The HIRIS instrument has similarities to instruments on Landsat and the European SPOT. Comparisons should be made with the data rates, duty cycle, archive strategies, and costs of these older systems, as part of the process of defining the data system for HIRIS.

In forming sensing requirements, it would be helpful if OSSA would provide feedback to the users on the costs for different options in order to arrive at a good balance of costs and benefits.
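One way to provide that kind of feedback is simply to translate proposed data rates into tape counts and archive sizes. The sketch below redoes the arithmetic quoted above for the 100-kbps example and the 60 x 10^12 bits-per-year cloud-climate archive; the tape capacity of roughly 10^9 bits per 6,250-BPI tape is inferred from the quoted figures, not a stated specification.

```python
# Reproduce the archive arithmetic quoted in the text. The tape capacity
# (~1e9 bits per 6,250-BPI high-density tape) is inferred from the quoted
# figure of roughly 3,160 tapes per year at 100 kbps.

SECONDS_PER_YEAR = 3.156e7      # about 365.25 days
TAPE_BITS = 1.0e9               # inferred capacity of one high-density tape

def annual_bits(rate_bps: float) -> float:
    """Bits accumulated in one year at a sustained rate."""
    return rate_bps * SECONDS_PER_YEAR

# The "common" 100-kbps satellite data rate.
bits = annual_bits(100e3)
print(f"100 kbps: {bits / 1e9:,.0f} x 10^9 bits/yr "
      f"-> {bits / TAPE_BITS:,.0f} tapes/yr")

# The 60 x 10^12 bits/yr cloud-climate archive, before and after sampling
# down to the reduced archive described in the text.
for label, kept_bits in [("full archive", 60e12),
                         ("reduced archive", 500e9)]:
    print(f"{label}: {kept_bits / TAPE_BITS:,.0f} tapes/yr")
```

At roughly 3,160 tapes per year, a single 100-kbps stream already exceeds what an individual PI can handle by more than an order of magnitude, which is why sampling strategies such as the cloud-climate reduction matter.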

Also, the plans for future data rates and archives should factor in better technology. It is anticipated that there will be an increase in storage cost effectiveness and of computing capability (per-unit cost) by a factor of 10 or 15 by 1995. However, one cannot plan for 100 times more archiving by 1995--when technology is projected to be perhaps only 10 times better--without carefully evaluating costs and benefits.

Since the Committee did not have the time to study this matter in great detail it can do no more than suggest that it be given careful review. In particular, the Committee is concerned that the likely budget cuts in the foreseeable future will mean that increased funds for sophisticated, and therefore expensive, information systems will come at the expense of investigator, instrument, and spacecraft portions of the programs. When data requirements are being discussed, there will always be some good reasons for better space resolution, more samples in time, and more channels. However, users do not need the highest-resolution data all of the time. We believe that achieving a balance between data and information systems and other aspects of the programs is essential.

Limitations of Current Digital Magnetic Recording and Compact Disk (CD) Read-Only Memory (ROM) Technologies. Even with efficient data-rate management and control, the current digital magnetic recording and CD-ROM technologies cannot cope with anticipated data rates in the Space Station era. OSSA needs to examine and support, to at least a limited extent, the development of alternate storage technologies, to support high throughput rates and capacities. Hybrid analog and digital recording formats and optical video disks similar to laser-vision disks are examples of alternate technologies that can be exploited.

A careful examination of continuous-throughput data-rate requirements for high-data-rate sensors is needed to reduce data volumes to a manageable level that is both consistent with user requirements and affordable. OSSA, in conjunction with the user community, should develop techniques (including data compression and on-board data extraction techniques) to reduce the data throughput requirements to a level consistent with contemporary technologies that are commercially available or expected to be developed commercially in the near term.

In reviewing the requirements of the first three technology areas listed above, the Committee adopted the assumption that continuous throughput requirements will vary from 10^6 to 10^9 bits per second (bps) in the 1990 time frame (see Figures 5 through 7 and Table 2 at the end of this section). The focus is on continuous rather than burst data rates, since the total cost and complexity of the information systems will to a large extent be determined by the continuous throughput requirements.

For data rates up to 10^6 bps, technology currently exists for space-to-ground transmission, and for processing, storing, and distributing data electronically to most users. At this rate, data can be processed (level zero), archived using magnetic media, and distributed to users in real time using commercial transmission facilities.

OSSA missions with non-imaging sensors or low-resolution imaging devices have continuous throughput rates of the order of 10^6 bps. These missions generate up to about 10^13 bits per year, and the data can be stored in about 10,000 physical storage units (PSU) such as tapes, disk packs, etc. Input/output (I/O) rates of 10^6 bps are easily available with tape and disk drives, and communications links operating at 1.544 Mbps (commonly called "T1" links or carriers, in reference to their commercial tariff designation) can be established easily at user locations for data distribution. Processing speeds of 10 million instructions per second (MIPS), or up to 100 instructions per byte of data, will be needed for level-zero processing. Such speeds are currently available.

Increasing the throughput requirements to 10^7 bps will stretch the current capabilities in some areas. One exception is the space-to-ground link, in which capacities of 100 Mbps are currently available. While magnetic recording media can handle I/O rates of 10^7 bps, the annual volume of 10^14 bits will require over 100,000 tapes per year (CD ROMs cannot handle input rates of 10^7 bps). Near-real-time processing and distribution of data to users still might be feasible as long as a single user does not demand access to all the data over extended periods of time. Processing speeds of 100 MIPS to handle level-zero processing, as well as storage requirements of over 100,000 PSUs per year, present some major problems using projections of current technology.

Data rates of the order of 10^8 bps present possibly insurmountable problems and challenges. Processing speeds of over 1,000 MIPS, and I/O rates of 100 Mbps into and from storage media, are difficult to achieve unless parallel-processing techniques are used. Even then, the number of PSUs will be of the (unmanageable) order of 10^6 units per year. Near-real-time distribution of data to users may not be economically feasible at these rates.

We do not anticipate an exponential growth in the I/O rates and storage capacities of magnetic media (or CD ROMs), or throughput rates of contemporary production networks. Specially designed multichannel magnetic recorders or very-high-speed integrated circuit (VHSIC) memories may provide a means to capture and process short bursts of data at rates of 10^8 bps. However, current technology, as well as what is projected to be available in the time frame being considered, cannot support the processing, storage, and distribution of data at sustained rates of 10^8 bps or higher.
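The tiered argument above is easy to tabulate. The sketch below uses the text's rule of thumb of 100 instructions per byte for level-zero processing; the storage-unit capacity of roughly 3 x 10^9 bits is inferred from the quoted counts, not a stated figure.

```python
# Tabulate the throughput tiers discussed above, using the report's rule of
# thumb of ~100 instructions per byte for level-zero processing. The PSU
# capacity (~3e9 bits) is inferred from the quoted storage-unit counts.

SECONDS_PER_YEAR = 3.156e7
PSU_BITS = 3.0e9                   # inferred bits per tape or disk pack
INSTRUCTIONS_PER_BYTE = 100        # level-zero processing rule of thumb

for rate_bps in (1e6, 1e7, 1e8):
    annual_bits = rate_bps * SECONDS_PER_YEAR
    psus_per_year = annual_bits / PSU_BITS
    mips_needed = (rate_bps / 8) * INSTRUCTIONS_PER_BYTE / 1e6
    print(f"{rate_bps:.0e} bps: {annual_bits:.1e} bits/yr, "
          f"~{psus_per_year:,.0f} PSUs/yr, "
          f"~{mips_needed:,.0f} MIPS for level-zero processing")
```

The counts come out near the report's round figures (about 10^4, 10^5, and 10^6 storage units per year, and roughly 10, 100, and over 1,000 MIPS), which makes the jump in difficulty between 10^7 and 10^8 bps easy to see.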

The need for data rates of 10^8 bps or higher originates from high-resolution imaging sensors, such as multichannel spectral scanners (MSS), thematic mappers (TM), and synthetic aperture radars (SAR). There are two possible solutions to the problems created by these high-data-rate sensors. First, image data is highly redundant and data-compression schemes can be used to reduce the data rates by almost one to two orders of magnitude. Commercial coder-decoder (CODEC) devices are currently used in a variety of applications for data compression and reconstruction. In NASA systems, compression may take place on the space platform or on the ground where the level-zero processing is done. Fairly simple spatial and spectral compression algorithms can be applied to data streams of 10^8 bps to reduce the rate to 10^6 to 10^7 bps. While compression algorithms have been developed and applied to MSS image data, new algorithms need to be developed for data from SAR and other sensors whose statistics are quite different from those of MSS data. Once successful algorithms are applied to the outputs of high-data-rate sensors, the resulting reduced data rates can be handled with existing technology.

The development and application of data-compression techniques should be coordinated carefully with the user community, which traditionally takes the view that nobody should "mess around" with the data. They should be convinced that some trade-offs have to be made in order to maintain high throughputs over long periods of time. If the option to transmit uncompressed data over short periods of time, when needed, is maintained, the Committee believes that users can be convinced to accept compressed data (user involvement was discussed in Chapter V).

The data-compression issue may have to be looked at in the broader context of data or bandwidth management. Issues such as compressing data onboard versus compressing it on the ground, and using an "expert system" onboard to extract information and make decisions about how much data from each instrument to transmit to the ground, need continued study and analysis. At the higher data rates (>10^8 bps), the onboard processing requirements to implement any kind of "expert system" might require processing speeds in excess of 1,000 MIPS and may not be cost-effective. The cost trade-off between introducing additional processing requirements and savings that might result from reduced costs for storage and distribution must be analyzed carefully.

An alternate approach is to consider analog (or hybrid) recording techniques for storage purposes. Consider, for example, a standard TV signal which has a bandwidth of about 5 Megahertz (MHz). If this signal is digitized, the data rate required will be of the order of 10^8 bps without compression. Digital recording at this rate for as little as an hour will produce hundreds of digital magnetic tapes. However, several hours of the analog TV signal can be recorded on a single $4 VHS tape with a $200 recorder! Now, while digitizing facilitates easy multiplexing and transmission over long and noisy communication links, there are no significant advantages that warrant digital recording.

The CD ROM technology does not provide any attractive solution to high-volume, low-demand applications. It is most effective for low, continuous throughput and high demand (several hundred copies distributed) applications. The Committee sees promise in the use of commercially available recording technologies such as large-bandwidth analog, hybrid magnetic recording, or optical technologies. While analog or hybrid recording using magnetic tapes provides high throughput and capacities, random access to recorded data is not yet possible. Laser-vision and laser-video disks offer capacities and throughputs that are much higher than those of CD ROMs. Even though the throughput and capacities of laser-vision and laser-video disks may not be as high as analog magnetic tapes, they do provide random-access capabilities.
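A quick check of the digitized-TV figure above, and of what a one-to-two-order-of-magnitude compression gain does to a 10^8-bps sensor stream, is shown below. Sampling at twice the bandwidth with 8 bits per sample is the usual rule of thumb; the tape capacity is the same inferred figure of roughly 10^9 bits used earlier, not a stated specification.

```python
# Check the digitized-TV arithmetic and the effect of compression on a
# 1e8-bps sensor stream. Sampling and tape figures are rule-of-thumb
# assumptions, not design values.

TV_BANDWIDTH_HZ = 5e6
BITS_PER_SAMPLE = 8
TAPE_BITS = 1.0e9                 # inferred high-density tape capacity

# Nyquist sampling at twice the bandwidth.
tv_rate_bps = 2 * TV_BANDWIDTH_HZ * BITS_PER_SAMPLE
tapes_per_hour = tv_rate_bps * 3600 / TAPE_BITS
print(f"Digitized 5-MHz TV: {tv_rate_bps:.1e} bps, "
      f"~{tapes_per_hour:.0f} tapes per hour uncompressed")

# One to two orders of magnitude of compression on a 1e8-bps sensor stream.
sensor_rate_bps = 1e8
for ratio in (10, 100):
    print(f"{ratio:3d}:1 compression -> {sensor_rate_bps / ratio:.0e} bps")
```

This is consistent with the text: an hour of uncompressed digitized video fills a few hundred tapes, while 10:1 to 100:1 compression brings a 10^8-bps stream back into the range that current magnetic media and links can handle.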

The throughput and capacities of video disks are an order of magnitude higher than those of CD ROMs; hence, the video-disk technology should be monitored.

Limitations of Commercial Database Management Systems (DBMS). Based on briefings from NASA personnel, the Committee understands that during the next decade, NASA's mission-specific data systems will be replaced by more generic, multi-disciplinary DBMSs. Data systems can be characterized as those where the users of the system are responsible for providing all desired management of the data, whereas DBMSs provide generic management capabilities as an integral part of the database system. The commercial world successfully underwent this transition some years ago, and it is evident that the engineering and scientific worlds are undergoing a similar transition today. Equally important, major standardization activities relative to DBMSs and associated capabilities (e.g., query languages, report writing facilities) are gaining in momentum. The advent of relational-based systems has been a major factor in the drive toward standardization and will provide a vendor-independent base for future database management systems technology.

The Committee also believes that OSSA and its constituent program and project offices should focus on using, to the greatest extent possible, commercially available DBMSs or derivatives thereof, rather than spend excessive amounts of resources in developing their own.

However, while commercially available DBMSs will provide a comprehensive set of data management facilities, there remain a number of areas in which these systems fall short of meeting the needs of the engineering and scientific communities for management of large volumes of space-derived data. In conjunction with NOAA, NSF, and the community of vendors and standards organizations, NASA/OSSA should focus on this shortcoming, and encourage the private sector and the standards organizations to develop appropriate solutions. Some of this is already being done: the agreement reached between NASA and NSF in NSF's supercomputer initiative is a major step in this direction. Many of the supercomputer centers will be extending commercially available database management systems to provide those facilities required for the target engineering and scientific communities. Additional efforts of this type are required. The Committee believes the major areas to be addressed are the following:

1. Performance. Much of the past reluctance of the engineering and scientific communities to adopt commercially available DBMSs has been the lack of numerically intensive computational performance available through the use of these systems. There has been an acceptance of this deficiency and much work is now underway to provide the necessary levels of performance. OSSA and its constituent user communities should quantify their performance requirements and make them known to vendors and other interested parties (e.g., the NSF).

2. Very large databases. Closely associated with the performance question discussed above is the question of the ability to handle very large databases. Traditional, commercially oriented DBMSs have not proven themselves to be particularly well suited to dealing with the massive amounts of data that normally are dealt with by the engineer or scientist. However, this shortcoming has indeed been recognized and much research is currently under way to improve the ability of DBMSs to deal effectively with very large databases, either directly or through the use of auxiliary processors.

3. Data definition capabilities. Commercial DBMSs have focused primarily on data-definitional facilities that have been oriented to the commercial world. These have proven not to be adequate for the engineering or scientific user. OSSA should understand better the needs of its user base in this area and transmit those needs to the appropriate standards organizations and vendors.

4. Data interchange. To achieve even a primitive level of interoperability, data interchange agreements must be formulated and agreed upon. These agreements or standards must be as non-constricting as possible; therefore, the Committee recommends that these standards be based on the notion of self-defining data (that is, data wherein the definition of the content of the data record is contained within the record itself; a minimal illustration follows this list). While we saw some indication of a beginning of this in the EOS project, it needs to be focused upon on a much broader base with a much higher assigned priority.

5. Directories and catalogs. The Committee has previously noted in this report the central role to be played by directories. We believe that effective and efficient directory management capabilities (including the ability to access and update these directories) will be a key factor in achieving systems interoperability. User requirements for both directory content and directory management should be gathered, analyzed, and submitted to vendors and appropriate standards organizations for consideration and adoption.

6. Distributed systems. It is inevitable that NASA scientists will be involved at a global level with a hierarchy of systems, with much distribution of both data and processing being both desirable and necessary. Fundamental architectural decisions, accommodating heterogeneous systems and vendors, should be dealt with immediately. For example, will control information and responsibility be centrally managed or distributed? What will be the capabilities for shipping data to work and/or work to data?
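To make the self-defining-data notion in item 4 concrete, the following sketch packs a small data record together with a header that describes its own field names and binary layout, so that a reader needs no external format document. The field names, layout, and encoding are purely illustrative assumptions, not an EOS or CODMAC format.

```python
# A minimal illustration of "self-defining data": each record carries a header
# that names its fields and gives the binary layout, so a receiving system can
# decode the record without an external format document. The field names and
# layout are purely illustrative.

import json
import struct

def pack_record(names, fmt, values):
    """Build one record: [4-byte header length][JSON header][packed binary body]."""
    header = json.dumps({"fields": names, "format": fmt}).encode("ascii")
    body = struct.pack(fmt, *values)
    return struct.pack(">I", len(header)) + header + body

def unpack_record(record):
    """Decode a record using only the description embedded in the record itself."""
    (header_len,) = struct.unpack(">I", record[:4])
    header = json.loads(record[4:4 + header_len])
    values = struct.unpack(header["format"], record[4 + header_len:])
    return dict(zip(header["fields"], values))

# Hypothetical observation record; names and values are made up for the example.
rec = pack_record(["channel", "scan_line", "radiance"], ">HId", (17, 1024, 42.7))
print(unpack_record(rec))   # {'channel': 17, 'scan_line': 1024, 'radiance': 42.7}
```

Any comparable scheme (a label block, a packed descriptor, and so on) serves the same purpose; what matters for interchange is that the record itself, not a separate document, is the authority on its own layout.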

[Figure 5. Peak data rate time profile (audio/video and telemetry), 1992 through the late 1990s.]

[Figure 6. Average data rate time profile (audio/video and telemetry), 1992-1998.]

[Figure 7. Daily data volume time profile (audio/video and telemetry), 1992-1998.]

[Table 2. Data rates for selected instruments and missions through the 1990s.]

Appendix A

REFERENCES

Data Management and Computation, Volume 1: Issues and Recommendations, Committee on Data Management and Computation (Ralph Bernstein, et al.), National Research Council, National Academy Press, Washington, D.C., 1982.

Earth System Science Overview: A Program for Global Change, Earth System Science Committee (Francis Bretherton, et al.), National Aeronautics and Space Administration (NASA), May 1986.

Issues and Recommendations Associated with Distributed Computation and Data Management Systems for the Space Sciences, Committee on Data Management and Computation (Raymond Arvidson, et al.), National Research Council, National Academy Press, Washington, D.C., 1986.

Jenne, R., Planning Guidance for the World Climate Data System, World Climate Programme Office, WMO, Geneva, 1982.

Kieffer, H., et al., Planetary Data Workshop, NASA Conference Pub. No.

Report of the Eos Data Panel (NASA Technical Memorandum 87777, Volume IIa), Earth Observing System Data Panel (Robert R. P. Chase, et al.), NASA, 1986.

Solar-Terrestrial Data Access, Distribution, and Archiving, Joint Data Panel of the Committee on Solar and Space Physics (Space Science Board) and the Committee on Solar-Terrestrial Research (Board on Atmospheric Sciences and Climate) (Margaret A. Shea, Donald J. Williams, et al.), National Research Council, National Academy Press, Washington, D.C., 1984.

A Strategy for Earth Sciences from Space in the 1980s, Part I: Solid Earth and Oceans, Committee on Earth Sciences (Sean C. Solomon, et al.), National Research Council, National Academy Press, Washington, D.C., 1982.

A Strategy for Earth Sciences from Space in the 1980s and 1990s, Part II: Atmosphere and Interactions with the Solid Earth, Oceans, and Biota, Committee on Earth Sciences (Ronald G. Prinn, et al.), National Research Council, National Academy Press, Washington, D.C., 1985.

Appendix B

ACRONYMS AND ABBREVIATIONS USED IN THIS REPORT

AVHRR - Advanced Very High Resolution Radiometer
BPI - Bits per inch
bps - Bits per second
CCITT - International Telegraph and Telephone Consultative Committee
CD - Compact disk
COP - Concept Design Phase
CODEC - Coder-Decoder
CODMAC - Committee on Data Management and Computation (NRC)
DBMS - Data Base Management System
DoD - Department of Defense
DOE - Department of Energy
ELV - Expendable Launch Vehicle
EOIS - Earth Observing Information System
EOS - Earth Observing System
ERBE - Earth Radiation Budget Experiment
ESSC - Earth System Science Committee (NASA)
Gbps - Gigabits per second
GOES - Geostationary Operational Environmental Satellite
GSFC - Goddard Space Flight Center (NASA)
HIRIS - High Resolution Imaging Spectrometer
I/O - Input/Output
IP - Internet Protocol
ISO - Information Systems Office (NASA)
ISO - International Standards Organization
JPL - Jet Propulsion Laboratory (NASA)
kbps - Kilobits per second
Mbps - Megabits per second
MHz - Megahertz
MIPS - Million instructions per second
MSS - Multispectral Scanner
NAIF - Navigation and Ancillary Information Facility
NAIS - Navigation and Ancillary Information System
NASA - National Aeronautics and Space Administration
NCAR - National Center for Atmospheric Research
NOAA - National Oceanographic and Atmospheric Administration
NRC - National Research Council

NSF - National Science Foundation
NSSDC - National Space Sciences Data Center (NASA/GSFC)
OAST - Office of Aeronautics and Space Technology (NASA)
OSI - Open Systems Interconnection
OSSA - Office of Space Science and Applications (NASA)
OSTDS - Office of Space Tracking and Data Systems (NASA)
PCDS - Pilot Climate Data System
PDS - Planetary Data System
PI - Principal Investigator
PLDS - Pilot Land Data System
PODS - Pilot Ocean Data System
PSCN - Program Support Communications Network (NASA)
PSU - Physical Storage Unit
R&D - Research and Development
RFP - Request for Proposal
ROM - Read-Only Memory
SAIS - Science and Applications Information System
SAR - Synthetic Aperture Radar
SPAN - Space Physics Applications Network
SSB - Space Science Board (NRC)
SSIS - Space Science Information System
TCP/IP - Transmission Control Protocol/Internet Protocol
THIR - Temperature-Humidity Infrared
TM - Thematic Mapper
TMIS - Technical Management Information System (NASA)
TOPEX - Topography Experiment for Ocean Circulation
TP-4 - Transport Protocol 4
UARS - Upper Atmosphere Research Satellite
USDA - U.S. Department of Agriculture
USGS - U.S. Geologic Survey
VHSIC - Very-High-Speed Integrated Circuit
