The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.
VI. INFORMATION SYSTEMS TECHNOLOGY

Over the past quarter century, OSSA has employed a mission-oriented approach to the collection of data in support of a variety of science applications. During each mission, data were collected to meet the requirements of a small group of scientists in a particular discipline, using a data system that had been developed for that purpose. However, as noted earlier, there is an increasing need for interdisciplinary and multi-disciplinary scientific work that will change the way information systems are structured. OSSA knows that as it moves into the Space Station era, mission and discipline boundaries will blur, and huge volumes of data will be collected by NASA and others to support a large number of interdisciplinary projects involving hundreds of scientists. OSSA has already initiated comprehensive planning for such missions and for the challenge of information systems that can handle the huge volumes of data and the product requirements of the users.

As an example, the Committee is concerned that even with efficient data-rate management and control, the current digital magnetic recording and compact disk (CD) read-only memory (ROM) technologies cannot cope with anticipated data rates in the Space Station era. Further, commercial database management systems currently do not have the features required to manage large volumes of space-derived data. These technological problems are compounded by such management and operational considerations as the need to control costs (which potentially affects OSSA's ability to support the users) and the need to support the users (which influences costs). Therefore, the Committee suggests the following as the final major issue to be addressed by OSSA in the context of this study:

Issue #4: How can the projected information systems technologies keep pace with future sensor outputs?
After reviewing the technology requirements of NASA information systems in the Space Station era, the Committee believes the specific areas of technological concern are:

- the trend toward development of higher data-rate instruments for use in remote Earth sensing, taking into consideration the actual user need and constraints imposed by information systems technology and costs (discussed below);

- the ability of current digital magnetic recording and compact disk (CD) read-only memory (ROM) technologies to cope with anticipated data rates in the Space Station era in support of on-board processing, space-to-ground transmission, "level-zero" processing (that is, data that have been corrected for telemetry errors and decommutated), and the storage and retrieval of data (discussed below);

- the ability of commercial database management systems to manage large volumes of space-derived data (discussed below);

- the need for cohesive planning and a unified approach to the creation and control of software (discussed in Section III of this report); and

- the fragmented and mostly incompatible data transfer and electronic communication between elements of the OSSA and the user community, which makes data and information transfer difficult (discussed in Chapter IV).

The Trend Toward Higher Data-Rate Instruments. Scientists are limited in how much data they can effectively evaluate. Most users will want data over a small test site or a sampling of the data to meet their scientific needs. The Committee does not believe information systems should be designed to provide all data acquired by the high data-rate instruments, unless there is an overwhelming scientific justification. A careful cost-to-benefit analysis should be made before designing a data system for the high-data-rate instruments. Some data sensors have the capability of drowning data systems with so much data that costs become unreasonable and technology may not even be able to cope with the data stream. If all data are saved, one cannot afford to extract the information that is really needed.
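The "drowning" concern can be made concrete with the instrument rates cited later in this chapter (SAR about 300 Mbps, HIRIS up to 900 Mbps, good computer disks 24 Mbps); the derived figures below are our own arithmetic, not from the report:

```python
# Rates quoted in this chapter, in megabits per second (Mbps).
SAR_MBPS = 300      # synthetic aperture radar, ~300 Mbps
HIRIS_MBPS = 900    # HIRIS, up to 900 Mbps
DISK_MBPS = 24      # a good computer disk of the era

# How many contemporary disk channels would each sensor saturate at full rate?
print(f"SAR:   {SAR_MBPS / DISK_MBPS:.1f} disk channels")    # 12.5
print(f"HIRIS: {HIRIS_MBPS / DISK_MBPS:.1f} disk channels")  # 37.5

# Bits accumulated by one hour of continuous SAR operation.
bits = SAR_MBPS * 1e6 * 3600
print(f"One hour of SAR: {bits:.1e} bits")  # ~1.1e12 bits
```

Even before any archiving question arises, a single high-rate sensor outruns the fastest contemporary storage channel by an order of magnitude.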
Several sensors, such as HIRIS and the synthetic aperture radar (SAR) that will be part of EOS, dominate the planned data volume environment. Technology improvements should enable NASA to increase the cost effectiveness of handling new data by a factor of ten by 1995. However, it appears that data gathering may increase by much more than the technology gain unless careful plans are made for use of the high-rate sensors. The data rates for SAR [about 300 megabits per second (Mbps)] and HIRIS (up to 900 Mbps) compare with data rates to good computer disks of 24 Mbps, and Cray supercomputer special-channel speeds of 1,000 Mbps. With data rates even a fraction of these, one must establish a mechanism to cope with questions of what sampling and data archiving make sense. The strategy should include a projection of year-1995 technology and costs, and an effort to drive data storage costs down. With lower storage

costs, it is reasonable to save more data that will be used for local studies and case studies for short time periods. Table 1 and Figure 4 summarize data rates from selected instruments during the next 10 to 15 years (also see Table 2 and Figures 5 through 7 at the end of this chapter).

Another class of studies of growing importance requires processing data over a number of years. If all OSSA does is to save high-volume data for many years, it still cannot be used for such studies because it costs too much. Often data need to be sampled in several ways, such as the one-kilometer (high resolution) and four-kilometer resolution (global survey) data that are routinely supplied to NOAA.

A common satellite data rate of 100 kilobits per second (kbps) produces 3,160 x 10^9 bits per year, or 3,160 high-density tapes [6,250 bits per inch (BPI)] each year. An individual PI usually can cope with only 20 to 100 tapes per year. A data center usually charges $60 to $100 per tape copy, and then it often costs the PI even more to process it. The International Satellite Cloud Climate Programme is now sampling data from several geosynchronous-orbit satellites and one polar-orbiting satellite to reduce the archive from about 60 x 10^12 bits per year to two archives, one of about 500 tapes (500 x 10^9 bits per year) and the other of about 100 tapes per year. The international processing unit at the Goddard Institute for Space Studies is able to process the smaller of these two archives to derive cloud statistics. While the above data rate of 60 x 10^12 bits per year has posed a difficult problem for long-term studies, it should be noted that the composite data rate being planned by NASA for 1995 is more than 50 times greater (see Figure 7).

In EOS, a NASA division proposed to limit the onboard system to handle aggregate instrument rates not over 20 Mbps. The limit is under debate. Other very-high-rate sensors such as SAR and HIRIS will be handled separately.
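The tape arithmetic above can be checked in a few lines (a sketch; the roughly 10^9-bit capacity per 6,250-BPI tape is inferred from the text's own figures rather than stated directly):

```python
SECONDS_PER_YEAR = 3.156e7      # ~365.25 days

# A common satellite data rate of 100 kbps, accumulated over one year.
rate_bps = 100e3
bits_per_year = rate_bps * SECONDS_PER_YEAR
print(f"{bits_per_year:.3g} bits/year")        # ~3.16e12, i.e. 3,160 x 10^9

# The text's count of 3,160 high-density tapes per year implies
# roughly 10^9 bits per tape.
tapes_per_year = 3160
bits_per_tape = bits_per_year / tapes_per_year
print(f"~{bits_per_tape:.1e} bits per tape")   # ~1e9

# Copying cost at the quoted $60 to $100 per tape:
print(f"${tapes_per_year * 60:,} to ${tapes_per_year * 100:,} per year")
```

At those prices, merely copying a full year of a single 100-kbps stream costs a PI several hundred thousand dollars, before any processing.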
The Committee thinks this NASA strategy is wise. The high data rates demand more careful attention to decide what sampling strategies and data rates make sense. The main uses for SAR are ocean wave statistics, ice coverage and location, and land resource studies. To obtain ocean waves, one needs only a small, square array of samples located 100 or 200 km apart from each other, perhaps closer together in coastal waters or near a major storm. As indicated above, it seems likely that user requests for HIRIS channels will be rather modest compared with the capability now being planned. The HIRIS instrument has similarities to instruments on Landsat and the European SPOT. Comparisons should be made with the data rates, duty cycle, archive strategies, and costs of these older systems, as part of the process of defining the data system for HIRIS.

In forming sensing requirements, it would be helpful if OSSA would provide feedback to the users on the costs for different options in order to arrive at a good balance of costs and benefits. Also, the plans for future data rates and archives should factor in better technology. It is

anticipated that there will be an increase in storage cost effectiveness and of computing capability (per-unit cost) by a factor of 10 or 15 by 1995. However, one cannot plan for 100 times more archiving by 1995--when technology is projected to be perhaps only 10 times better--without carefully evaluating costs and benefits. Since the Committee did not have the time to study this matter in great detail, it can do no more than suggest that it be given careful review.

In particular, the Committee is concerned that the likely budget cuts in the foreseeable future will mean that increased funds for sophisticated, and therefore expensive, information systems will come at the expense of investigator, instrument, and spacecraft portions of the programs. When data requirements are being discussed, there will always be some good reasons for better space resolution, more samples in time, and more channels. However, users do not need the highest-resolution data all of the time. We believe that achieving a balance between data and information systems and other aspects of the programs is essential.

Limitations of Current Digital Magnetic Recording and Compact Disk (CD) Read-Only Memory (ROM) Technologies. Even with efficient data-rate management and control, the current digital magnetic recording and CD-ROM technologies cannot cope with anticipated data rates in the Space Station era. OSSA needs to examine and support, to at least a limited extent, the development of alternate storage technologies to support high throughput rates and capacities. Hybrid analog and digital recording formats and optical video disks similar to laser-vision disks are examples of alternate technologies that can be exploited.

A careful examination of continuous-throughput data-rate requirements for high-data-rate sensors is needed to reduce data volumes to a manageable level that is both consistent with user requirements and affordable.
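The mismatch the paragraph warns about is simply the ratio of the two projections (a sketch using the text's round numbers; the factor-of-100 archiving growth and factor-of-10 technology gain are as stated above):

```python
archive_growth = 100   # planned archiving volume, 1995 vs. today
tech_gain = 10         # projected storage/computing cost-effectiveness gain

# Even after the technology improvement is applied, the net cost of
# archiving still rises by the remaining ratio.
cost_multiplier = archive_growth / tech_gain
print(f"Archiving costs would still rise ~{cost_multiplier:.0f}x")  # ~10x
```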
OSSA, in conjunction with the user community, should develop techniques (including data compression and on-board data extraction techniques) to reduce the data throughput requirements to a level consistent with contemporary technologies that are commercially available or expected to be developed commercially in the near term.

In reviewing the requirements of the first three technologies listed on the preceding pages, the Committee adopted the assumption that continuous throughput requirements will vary from 10^6 to 10^9 bits per second (bps) in the 1990 time frame (see Figures 5 through 7 and Table 2 at the end of this section). The focus is on continuous rather than burst data rates, since the total cost and complexity of the information systems will to a large extent be determined by the continuous throughput requirements.

For data rates up to 10^6 bps, technology currently exists for space-to-ground transmission, and for processing, storing, and distributing data electronically to most users. At this rate, data can be processed (level zero), archived using magnetic media, and distributed to users in real time using commercial transmission facilities. OSSA missions with non-imaging sensors or low-resolution imaging devices have

continuous throughput rates of the order of 10^6 bps. These missions generate up to 10^13 bits per year, and the data can be stored in about 10,000 physical storage units (PSU) such as tapes, disk packs, etc. Input/output (I/O) rates of 10^6 bps are easily available with tape and disk drives, and communications links operating at 1.544 Mbps (commonly called "T1" links or carriers, in reference to their commercial tariff designation) can be established easily at user locations for data distribution. Processing speeds of 10 million instructions per second (MIPS), or up to 100 instructions per byte of data, will be needed for level-zero processing. Such speeds are currently available.

Increasing the throughput requirements to 10^7 bps will stretch the current capabilities in some areas. One exception is the space-to-ground link, in which capacities of 100 Mbps are currently available. While magnetic recording media can handle I/O rates of 10^7 bps, the annual volume of 10^14 bits will require over 100,000 tapes per year (CD ROMs cannot handle input rates of 10^7 bps). Near-real-time processing and distribution of data to users still might be feasible as long as a single user does not demand access to all the data over extended periods of time. Processing speeds of 100 MIPS to handle level-zero processing, as well as storage requirements of over 100,000 PSUs per year, present some major problems using projections of current technology.

Data rates of the order of 10^8 bps present possibly insurmountable problems and challenges. Processing speeds of over 1,000 MIPS, and I/O rates of 100 Mbps into and from storage media, are difficult to achieve unless parallel-processing techniques are used. Even then, the number of PSUs will be of the (unmanageable) order of 10^6 units per year. Near-real-time distribution of data to users may not be economically feasible at these rates.
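The scaling across these three tiers is linear in the sustained rate, which a short order-of-magnitude calculation makes explicit (a sketch; the roughly 10^9-bit PSU capacity is inferred from the 10,000-units-per-year figure at 10^6 bps, not stated in the text):

```python
import math

SECONDS_PER_YEAR = 3.156e7
BITS_PER_PSU = 1e9   # implied by ~10,000 units/year at 10^6 bps

# For each sustained-rate tier discussed in the text, the annual data
# volume and the implied number of physical storage units (PSUs).
for rate_bps in (1e6, 1e7, 1e8):
    bits = rate_bps * SECONDS_PER_YEAR
    psus = bits / BITS_PER_PSU
    print(f"{rate_bps:.0e} bps: ~10^{math.log10(bits):.0f} bits/yr, "
          f"~10^{math.log10(psus):.0f} PSUs/yr")
```

To the nearest power of ten this reproduces the text's figures: 10^13 bits and ~10^4 units at 10^6 bps, 10^14 bits and ~10^5 tapes at 10^7 bps, and ~10^6 units per year at 10^8 bps.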
We do not anticipate an exponential growth in the I/O rates and storage capacities of magnetic media (or CD ROMs), or throughput rates of contemporary production networks. Specially designed multichannel magnetic recorders or very-high-speed integrated circuit (VHSIC) memories may provide a means to capture and process short bursts of data at rates of 10^8 bps. However, current technology, as well as what is projected to be available in the time frame being considered, cannot support the processing, storage, and distribution of data at sustained rates of 10^8 bps or higher.

The need for data rates of 10^8 bps or higher originates from high-resolution imaging sensors, such as multichannel spectral scanners (MSS), thematic mappers (TM), and synthetic aperture radars (SAR). There are two possible solutions to the problems created by these high-data-rate sensors. First, image data is highly redundant, and data-compression schemes can be used to reduce the data rates by almost one to two orders of magnitude. Commercial coder-decoder (CODEC) devices are currently used in a variety of applications for data compression and reconstruction. In NASA systems, compression may take place on the space platform or on the ground where the level-zero processing is done. Fairly simple spatial and

spectral compression algorithms can be applied to data streams of 10^8 bps to reduce the rate to 10^6 to 10^7 bps. While compression algorithms have been developed and applied to MSS image data, new algorithms need to be developed for data from SAR and other sensors whose statistics are quite different from those of MSS data. Once successful algorithms are applied to the outputs of high-data-rate sensors, the resulting reduced data rates can be handled with existing technology.

The development and application of data-compression techniques should be coordinated carefully with the user community, which traditionally takes the view that nobody should "mess around" with the data. They should be convinced that some trade-offs have to be made in order to maintain high throughputs over long periods of time. If the option to transmit uncompressed data over short periods of time, when needed, is maintained, the Committee believes that users can be convinced to accept compressed data (user involvement was discussed in Chapter V).

The data-compression issue may have to be looked at in the broader context of data or bandwidth management. Issues such as compressing data onboard versus compressing it on the ground, and using an "expert system" onboard to extract information and make decisions about how much data from each instrument to transmit to the ground, need continued study and analysis. At the higher data rates (>10^8 bps), the onboard processing requirements to implement any kind of "expert system" might require processing speeds in excess of 1,000 MIPS and may not be cost-effective. The cost trade-off between introducing additional processing requirements and savings that might result from reduced costs for storage and distribution must be analyzed carefully.

An alternate approach is to consider analog (or hybrid) recording techniques for storage purposes. Consider, for example, a standard TV signal which has a bandwidth of about 5 Megahertz (MHz).
If this signal is digitized, the data rate required will be of the order of 10^8 bps without compression. Digital recording at this rate for as little as an hour will produce hundreds of digital magnetic tapes. However, several hours of the analog TV signal can be recorded on a single $4 VHS tape with a $200 recorder! Now, while digitizing facilitates easy multiplexing and transmission over long and noisy communication links, there are no significant advantages that warrant digital recording.

The CD ROM technology does not provide any attractive solution to high-volume, low-demand applications. It is most effective for low, continuous throughput and high demand (several hundred copies distributed) applications. The Committee sees promise in the use of commercially available recording technologies such as large-bandwidth analog, hybrid magnetic recording, or optical technologies. While analog or hybrid recording using magnetic tapes provides high throughput and capacities, random access to recorded data is not yet possible. Laser-vision and laser-video disks offer capacities and throughputs that are much higher than those of CD ROMs. Even though the throughput and capacities of laser-vision and laser-video disks may not be as high as analog magnetic tapes, they do

provide random-access capabilities. The throughput and capacities of video disks are an order of magnitude higher than those of CD ROMs; hence, the video-disk technology should be monitored.

Limitations of Commercial Database Management Systems (DBMS). Based on briefings from NASA personnel, the Committee understands that during the next decade, NASA's mission-specific data systems will be replaced by more generic, multi-disciplinary DBMSs. Data systems can be characterized as those where the users of the system are responsible for providing all desired management of the data, whereas DBMSs provide generic management capabilities as an integral part of the database system. The commercial world successfully underwent this transition some years ago, and it is evident that the engineering and scientific worlds are undergoing a similar transition today. Equally important, major standardization activities relative to DBMSs and associated capabilities (e.g., query languages, report-writing facilities) are gaining in momentum. The advent of relational-based systems has been a major factor in the drive toward standardization and will provide a vendor-independent base for future database management systems technology. The Committee also believes that OSSA and its constituent program and project offices should focus on using, to the greatest extent possible, commercially available DBMSs or derivatives thereof, rather than spend excessive amounts of resources in developing their own.

However, while commercially available DBMSs will provide a comprehensive set of data management facilities, there remain a number of areas in which these systems fall short of meeting the needs of the engineering and scientific communities for management of large volumes of space-derived data.
In conjunction with NOAA, NSF, and the community of vendors and standards organizations, NASA/OSSA should focus on this shortcoming, and encourage the private sector and the standards organizations to develop appropriate solutions. Some of this is already being done: the agreement reached between NASA and NSF in NSF's supercomputer initiative is a major step in this direction. Many of the supercomputer centers will be extending commercially available database management systems to provide those facilities required for the target engineering and scientific communities. Additional efforts of this type are required. The Committee believes the major areas to be addressed are the following:

1. Performance. Much of the past reluctance of the engineering and scientific communities to adopt commercially available DBMSs has been the lack of numerically intensive computational performance available through the use of these systems. There has been an acceptance of this deficiency and much work is now underway to provide the necessary levels of performance. OSSA and its constituent user communities should quantify their performance requirements and make them known to vendors and other interested parties (e.g., the NSF).

2. Very large databases. Closely associated with the performance question discussed above is the question of the ability to handle very large databases. Traditional, commercially oriented DBMSs have not proven themselves to be particularly well suited to dealing with the massive amounts of data that normally are dealt with by the engineer or scientist. However, this shortcoming has indeed been recognized, and much research is currently under way to improve the ability of DBMSs to deal effectively with very large databases, either directly or through the use of auxiliary processors.

3. Data definition capabilities. Commercial DBMSs have focused primarily on data-definitional facilities that have been oriented to the commercial world. These have proven not to be adequate for the engineering or scientific user. OSSA should understand better the needs of its user base in this area and transmit those needs to the appropriate standards organizations and vendors.

4. Data interchange. To achieve even a primitive level of interoperability, data interchange agreements must be formulated and agreed upon. These agreements or standards must be as non-constricting as possible; therefore, the Committee recommends that these standards be based on the notion of self-defining data (that is, data wherein the definition of the content of the data record is contained within the record itself). While we saw some indication of a beginning of this in the EOS project, it needs to be focused upon on a much broader base with a much higher assigned priority.

5. Directories and catalogs. The Committee has previously noted in this report the central role to be played by directories. We believe that effective and efficient directory management capabilities (including the ability to maintain these directories) will be a key factor in achieving systems interoperability.
User requirements for both directory content and directory management should be gathered, analyzed, and submitted to vendors and appropriate standards organizations for consideration and adoption.

6. Distributed systems. It is inevitable that NASA scientists will be involved at a global level with a hierarchy of systems, with much distribution of both data and processing being both desirable and necessary. Fundamental architectural decisions, accommodating heterogeneous systems and vendors, should be dealt with immediately. For example, will control information and responsibility be centrally managed or distributed? What will be the capabilities for shipping data to work and/or work to data?
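The notion of self-defining data raised in item 4 can be illustrated with a minimal record format (our own sketch; the layout and field names are hypothetical and not drawn from any NASA or EOS standard):

```python
import struct

# A self-defining record: a textual header describing the fields precedes
# the binary data, so a reader needs no external format specification.

def pack_record(fields):
    """fields: list of (name, format_char, value); formats per 'struct'."""
    header = ";".join(f"{name}:{fmt}" for name, fmt, _ in fields).encode()
    body = b"".join(struct.pack("<" + fmt, v) for _, fmt, v in fields)
    return struct.pack("<H", len(header)) + header + body

def unpack_record(blob):
    """Recover name/value pairs using only the embedded description."""
    (hlen,) = struct.unpack_from("<H", blob, 0)
    header = blob[2:2 + hlen].decode()
    offset = 2 + hlen
    out = {}
    for item in header.split(";"):
        name, fmt = item.split(":")
        (value,) = struct.unpack_from("<" + fmt, blob, offset)
        offset += struct.calcsize(fmt)
        out[name] = value
    return out

rec = pack_record([("scan_line", "I", 42), ("gain", "f", 1.5)])
print(unpack_record(rec))  # {'scan_line': 42, 'gain': 1.5}
```

Because the record carries its own definition, two heterogeneous systems can exchange such data without prior agreement on a fixed layout, which is the interoperability property item 4 asks for.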

[Figure 5: Peak data rate time profile, 1992-1996 (audio/video and telemetry components).]

[Figure 6: Average data rate time profile, 1992-1998 (audio/video and telemetry components).]

[Figure 7: Daily data volume time profile, 1992-1998 (audio/video and telemetry components).]

[Table 2: Projected data rates and volumes for selected instruments.]