

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




Chapter 3

TRENDS AFFECTING THE BASE LEVEL AUTOMATION PROGRAM

The conclusions presented in the earlier parts of this report derive from the committee's review of the Phase IV program and from its judgments about some important technological trends and their applications to automated data processing. This chapter describes some of these trends and their implications more fully.

A central trend is the continuing rapid decrease in the physical size and cost of a machine to deliver a given amount of computing power. Not only are computers' direct costs dropping, but also with decreasing physical size come dramatic savings in other costs: space, power, cooling, and physical security. Examples, some forecasts, and further implications are given in section 1 of this chapter.

Changes in the pattern of costs have brought about the development of specialized computers and their supporting software. Examples are intelligent terminals that permit relatively untrained users to enter data or frame queries, and "front end," or communications, processors that facilitate the operation of systems having multiple terminals or processors. These are noted elsewhere in this report. A further important example, that of the specialized data-base management system, is discussed in section 2.

Given more computing power and speed, systems can be designed to respond more flexibly to users' needs and to maintain contact with multiple users simultaneously. For these reasons, a design philosophy embodied in so-called transaction-oriented systems, discussed in section 3, has developed.

Out of the trends described above has come a philosophy of design for automated data processing systems that can support administrative and management functions in large organizations. Computers, user terminals, and specialized processors such as data-base machines can communicate interactively under control of their own programs and according to instructions from the terminals.
The facilities themselves are located wherever economy or convenience of operation dictates. Facilities are shared by users when sharing is economical, or for such specific functions as are most economically shared. Where economy or operational factors dictate, some users or functions may be served by dedicated machines. Software is written so that the user need not be concerned about the physical location of the computing facilities. A brief discussion of communications facilities for such a system appears in section 4. An illustrative system is sketched in section 5, based on typical practices in business organizations. Parallels with the Air Force are noted.

With the decreasing costs and increasing capabilities of computer hardware has come strong pressure to reduce the labor costs and time delays involved in writing programs. The art of program design has matured. Furthermore, a family of specialized computer systems and facilities has emerged to support program design and development and to improve the productivity of programmers. Those matters are discussed in section 6.

Small computers based on microprocessors are becoming available in a wide variety of sizes, prices, and capabilities. It is inevitable that these will offer substitutes for or additions to the services that the present base level system provides. Some implications are discussed in section 7.

Finally, section 8 comments on the differences between wartime and peacetime operations, and on their implications for the evolution of the base level ADP system beyond Phase IV.

1. SMALL HIGH-PERFORMANCE SYSTEMS

Techniques for fabricating the semiconductor devices that stand at the heart of every computer are expected to continue to evolve during the 1980's. Speed of operation will increase, and the number of active elements or logic devices that can be put on a single silicon chip will grow dramatically. By the late 1980's, it should be possible to put a million or more active elements on a single chip a few millimeters square, as compared, say, to a few tens of thousands in today's commercial practice, or to a few hundred ten years ago. Fabrication at these high densities is known, somewhat loosely, as very large scale integration (VLSI). At such high densities, a complete central processing unit of 250,000 gates, complete with memory, power, and input/output lines, could be scaled down to a few chips.
Further, if the chips used gallium arsenide as their primary semiconducting material, they could operate at a speed approximately an order of magnitude faster than those constructed by present techniques. Alternatively, if VLSI were used to implement the so-called Josephson junction technology, a complete central processor and expanded main memory, which now occupy several full-size cabinets, could be reduced to 40 cubic inches. The resulting package, which would not be much larger than a hand-held calculator, could execute some 250 million instructions per second.

It seems unlikely that current designs of the large mainframe computers will continue to be made in smaller and smaller physical sizes. In the near term, advances in VLSI technology will probably provide another generation or two of more powerful, smaller, and more cost-effective processors. Over the longer term, however, large systems will undoubtedly evolve in a different direction, probably toward arrays of specialized devices functioning in parallel. Many current limitations on information formats and on addressing will then virtually disappear.

One of the continuing beneficiaries of the reduction in cost per unit of operation will be the growing line of personal computers. Present systems use microprocessors and are available with printers, keyboards, and full lines of software. They sell for less than $5,000. Of this, only about $1,000 is for the central processing unit; the rest is for the peripheral equipment. With its software, such a system can handle the needs of a small business. Significantly more powerful systems, which should sell for prices only a few times greater, have been studied experimentally for commercial development. These systems have the power of processors currently defined as small or medium in scale. (Small systems execute instructions at up to 200,000 per second; medium systems, at up to 1 million per second.)

As these small high-performance systems reach the market, they should develop much as hand calculators did during the 1970's. As their usefulness becomes apparent, cost becomes considerably less important in deciding whether to buy them. For many base level applications, the economies of scale that once led to a single, central computer system will be gone. A great deal of flexibility will be possible in matching equipment to its intended use. Freed from the constraints of hardware, planning is likely to focus on user needs.

Other benefits may also flow from the use of multiple, inexpensive, high-performance systems and subsystems. An Air Force base that can process its own critical applications is less vulnerable to attack than one that depends on a regional connection. Even within a single base, several systems deployed over a wide area would be less vulnerable than a single centralized system. Because costs are low enough, redundancy can be built into survivability and backup planning.
Security and privacy are usually easier to handle on dedicated systems or subsystems than on those shared with many general users. Also, small systems are moved easily and can be powered by small generators, batteries, or even solar cells.

2. DATA BASE MANAGEMENT SYSTEMS

Over the last few years, the data processing industry has developed systems that use specialized data base management techniques and software. The results are clear: data base management systems (DBMS) can significantly benefit users, programmers, and maintainers. DBMS software that is currently available for all large mainframes and for many minicomputers often includes software for managing data communications, for inquiry and retrieval, and for assisting software development. Already such DBMS software is proving to be stable, efficient, and economical of resources; some form of it will probably become standard in future large systems.

Since its beginnings in the early 1970's, data base technology has grown rapidly. Its development has been encouraged by declining costs for random-access storage devices and by a recognition that integrating all data being handled by individual programs into one common pool can often improve efficiency. Today, data files at each Air Force base are associated with specific programs that process them in accordance with the organization of the records in the files. A given data element (e.g., a supply stock number, aircraft tail number, or Air Force specialty code) may exist in many files; it must be altered in each separate file whenever a change is made. Any format change (e.g., switching from a numeric to an alphanumeric designation, or changing from a five- to a six-digit identifier) requires changing every program that uses the data element and rewriting all files with the new record structure. Were the data in a common file, accessed by all programs that use it, the overhead in maintaining the common file could be significantly less than that of carrying separate files for each program.

Contemporary data base management systems are characterized by (1) integrated collections of data available to wide varieties of users; (2) data definition in the form of centralized and accessible descriptions of the names of all data elements, their properties (such as character or numerical type), and their relationships to other elements--relationships that can involve complex groups and hierarchies; (3) centralized control over data descriptions by data administrators; and (4) compatible inquiry and retrieval languages that can specify a variety of logical operations and output formats, thereby permitting nonspecialists to write simple retrieval and reporting programs for their particular requirements.

Data definition is the cornerstone of DBMS. By controlling definition, data administrators can successfully adopt a family of programs using shared data. The data administrator can publish a data dictionary, available to all programmers, that contains the names, formats, definitions, and relationships of all data elements.
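The role of a data dictionary can be sketched in miniature. The Python fragment below is purely illustrative: the element names and formats are hypothetical, not drawn from any actual Air Force data definitions. It shows how, once definitions are centralized, a format change such as the five- to six-digit identifier example above is made in one place rather than in every program and file.

```python
# Illustrative sketch of a centralized data dictionary (hypothetical names
# and formats). Each data element is defined once; every program validates
# records against the same definitions.

import re

DATA_DICTIONARY = {
    # element name: (description, regular expression for its format)
    "stock_number":   ("supply stock number",      r"^\d{5}$"),
    "tail_number":    ("aircraft tail number",     r"^[0-9]{2}-[0-9]{4}$"),
    "specialty_code": ("Air Force specialty code", r"^[0-9A-Z]{5}$"),
}

def validate(record):
    """Return the names of elements in `record` that violate the dictionary."""
    errors = []
    for name, value in record.items():
        _, fmt = DATA_DICTIONARY[name]
        if not re.match(fmt, value):
            errors.append(name)
    return errors

# A format change (say, five- to six-digit stock numbers) is made here only;
# no using program needs to be rewritten:
DATA_DICTIONARY["stock_number"] = ("supply stock number", r"^\d{6}$")
```

After the one-line dictionary change, every program that validates through the dictionary immediately enforces the new six-digit format.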
In most systems, programmers or ad hoc users can have access to such dictionaries to determine correct data names. In a well-designed DBMS, neither programmers nor ad hoc users of query systems can alter the physical or logical relationships among data elements. Programmers are prevented from defining new elements to be included in the data base, and all programs use common data definitions. Any new program can then retrieve or update data easily, formats can be changed without affecting multiple files or programs, and storage and retrieval mechanisms, hidden from programmers, can be made more efficient.

A well-designed DBMS also serves to maintain data quality. Failure or inadvertent misuse of a processing program may result in incorrect values being written into the data base. This error must be corrected or its effects undone, lest other programs use the incorrect values. Back-out facilities for doing this are usually provided by modern DBMS packages, along with complete facilities for producing and reloading safe copies of backup files.

Considerations of privacy and security are somewhat more important in data base management systems than in those in which files are maintained by individual programs. Because all users know that shared data exists, a DBMS must be able to protect the data from unauthorized access or disclosure. Privacy is usually achieved through a security mechanism, typically a password. Data administrators determine which users have access to what data. Audit trails are maintained to identify who gained access to what data under what conditions. Some of these techniques are at present only partially developed, but they are certain to receive a great deal more attention in the future development of DBMS software.

For the purposes of the Base Level Automation Program, data base management systems can greatly reduce programming and maintenance costs. Much of the source code of a typical COBOL program consists of definitions of data elements being read and produced by the program. When a data base is used, the DBMS software conveys data definitions to the program through a variety of linkages. When data descriptions change, all using programs can accommodate the change with little effort.

Ad hoc query languages supplied with most contemporary DBMS packages can contribute significantly to a system's utility. In the current Phase IV system, when a user requires a new report to be produced from existing files, a programming effort is frequently required. Under a DBMS, simple requests may frequently be accommodated without programmers, since the query language is usually an English-like representation of logical relationships, simple arithmetical operations, and report format descriptions. Users may be able to train low-level programmers to meet many of the ad hoc processing requirements that today go unfilled because of the shortage of experienced programmers needed for even the simplest of programs.

3. FROM BATCH TO TRANSACTION PROCESSING

Managers have always used periodic "snapshot" reports on, for example, the states of inventories or cash balances. Such reports have generally been the best available, given the accounting and reporting procedures with which managers have had to operate.
Thus, current base level computers are so configured that many of the important operations are done by batch processing. Consequently, files and the reports drawn from them are current only at intervals. Modern ADP systems, such as those that the Base Level Automation System hardware can support with appropriate terminals and software, have sufficient flexibility and power that files can be continuously updated. Each transaction can be taken care of as it occurs (hence the term "transaction-oriented"). Moreover, the current status of any particular balance or inventory can be reported on demand. With large memories and rapidly accessible mass storage (on-line discs and so-called "virtual memory" techniques), and with well-designed data base software, the computer can immediately evaluate and respond to a new entry and update the data base accordingly. Indeed, if data are captured at the source by a monitoring or sensing device connected to the computer, the status of that particular application can be kept current in essentially real time.
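The distinction can be made concrete with a deliberately simplified sketch. The Python fragment below is illustrative only, with invented inventory items, and does not represent any Phase IV software: each transaction is evaluated and posted the moment it occurs, and the current balance is available on demand rather than after the next batch run.

```python
# Illustrative transaction-oriented update (hypothetical inventory items).

inventory = {"bolt": 100, "rivet": 250}

def post_transaction(item, change):
    """Evaluate one transaction as it occurs and update the file immediately."""
    if item not in inventory:
        raise KeyError("unknown item: " + item)
    if inventory[item] + change < 0:
        raise ValueError("transaction rejected: balance would go negative")
    inventory[item] += change       # the file is continuously updated...
    return inventory[item]          # ...and current status is reportable on demand

# Under batch processing these entries would wait in a queue for the next run;
# here each is evaluated and posted as it arrives.
post_transaction("bolt", -40)       # issue 40 bolts
post_transaction("rivet", 50)       # receive 50 rivets
```

Note that the validity check (a balance cannot go negative) happens at posting time, so an erroneous entry is rejected immediately rather than discovered in the next periodic report.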

Transaction-oriented systems have changed the ways many businesses operate, for example, by reducing requirements on inventories and cash balances, by conserving managers' time, and by increasing the efficiency and productivity of workers whose output depends on information derived from the data system. The Air Force could realize these advantages by moving toward a transaction-oriented system. At the same time, it could reduce clerical and other labor costs now required for batch processing. The committee understands that the Air Force has been moving from batch to on-line operations for such functions as accounting, maintenance, and personnel. The transition will make transaction processing possible for such functions when the Phase IV system begins operation.

4. LOCAL COMMUNICATIONS SYSTEMS

A variety of developing technologies can provide flexible communications links within an air base. Perhaps the most readily available, taking into consideration existing facilities, is a computer-controlled telephone switching system. These private automatic branch exchanges (PABX) transmit both voice and data over the same wires. Regardless of whether the information is converted to an all-digital or an all-analog format, the computer that controls the switching function provides a flexible communications medium. Voice and data equipment are easily connected and accessed from any point in the system, and are readily linked to external telephone networks. Although limited to data transfer rates typical of terminal devices, a modern PABX is easily implemented and can share its investment costs with a voice system.

Local transmission systems that handle data at rates much higher than those accommodated by telephone lines are now commercially available. Both point-to-point and netted systems are possible. Private industry is developing interface and protocol standards to control data transmissions in such local systems.
As these standards become more widely accepted, more product and software support will become available for connections to conforming systems. Location, function, and speed will not be difficult problems in establishing communications between machines. These developments will considerably expand the definition of data processing systems, because access to the systems can be made available almost anywhere that it is required.

Communications can effectively serve data entry devices. In many present base-level systems, information is transferred in a series of steps from its sources to punched cards that are eventually entered into a computer. The process is slow, subject to handling errors, and rigid in format. For less than the lifetime costs of a keypunch, terminal devices for entering data can be located wherever needed and connected by a local communications system directly to the computer. And because of the low cost of microprocessors, the terminals can process data locally as it is entered, for example, checking for format conformity or data value ranges.
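The kind of local checking just described can be sketched as follows. The field names, formats, and ranges below are invented for illustration; the point is only that an inexpensive processor in the terminal can reject a malformed entry before it is ever transmitted.

```python
# Sketch of source-data checking at an intelligent terminal
# (field names and ranges are invented for illustration).

def check_entry(field, value):
    """Return an error message, or None if the entry may be transmitted."""
    if field == "quantity":
        if not value.isdigit():
            return "quantity must be numeric"
        if not 1 <= int(value) <= 9999:
            return "quantity out of range"
    elif field == "unit_code":
        if len(value) != 3 or not value.isalpha():
            return "unit code must be exactly three letters"
    return None          # entry conforms; transmit to the host
```

Because the check runs at the point of entry, the operator can correct a mistake on the spot instead of learning of it after the data has been handled several times.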

While most logic and memory devices continue to drop in price, those that connect operators to computer systems are not decreasing in cost as quickly. High-speed printers or plotters, for example, are built out of parts from technologies that are developing at different rates. For these, equipment costs remain high. However, easy connection to local communications systems provides a means of sharing costs and capabilities. Advances in local communications allow for a broadened concept of computer systems and a larger community of users. The benefits of automation can now be extended to whole processes rather than being restricted to individual intermediate steps.

5. THE DISPERSAL OF PROCESSING POWER

With computers available in a wide range of sizes and speeds, with a variety of convenient terminal devices, and with facilities for communicating among these under computer control, organizations can fashion automated data processing systems to fit their operational and organizational requirements. Many companies have developed their own approaches, but certain general characteristics can be observed:

There is strong interest in capturing data at its source, automatically if possible, or as an automatic feature of other operations that must be performed in any case.

Data terminals reduce the processing and communications load on the rest of the system by checking and editing data and putting it into a standard format before transmission, and by storing and transmitting it when the host computer is ready.

Sometimes there is no single, large host computer. Even when there is, it usually is a complex of special-purpose machines (communications processors, data base "engines," vector processors, etc.). These can provide functional dispersal of the processing load.
Organizations that are widely dispersed geographically save communications costs by distributing or dispersing their processing capacity to centers of major activity or to geographically concentrated clusters of terminals. A lateral dispersal of this kind is effective when each center or cluster deals largely with data needed primarily in that center or cluster. This kind of lateral dispersal may often be accompanied by a higher level of functional dispersal as well (e.g., to supply and accounting), since each function tends to be housed together.

Another kind of functional dispersal or distribution tends to follow the managerial hierarchy. At higher levels of the organization, the data of interest tend to be of a strategic or global nature. At lower levels, detailed data concerning local operations (such as the location of an item in a warehouse) may be required. Yet the fact that the item is there may be of global interest, and the sum of the items in all warehouses may be of strategic interest. Because the data required at higher levels are different in kind and typically simpler in detail, but may be processed or summarized in complex ways, it is convenient and often economical to treat this hierarchical dispersal as one would treat a functional or geographic one, with host computers chosen or configured to serve a particular organizational level efficiently.

As processing capability becomes cheaper and available in a wider range of modular sizes, it becomes economical to disperse or distribute processing more widely. Similar considerations apply to functional and hierarchical dispersal. Dispersed or distributed processors may be linked by any or all of a variety of means, including coaxial or fiber-optic cables, microwave links, or satellite communications. In fact, the Air Force has already settled on a structure for the Base Level Automation System that is widely dispersed geographically. In the future, greater dispersal even at a single air base may prove economical and operationally desirable.

Figure 1 illustrates a model automated data processing system embodying these principles of dispersal or distribution, keyed to the hierarchical structure of a manufacturing organization or, in a parallel display, to operations at an air base. Some further properties and requirements of such a system are:

There may be multiple computers at each level, especially at upper levels.

Communications between offices may be by telecommunications, magnetic tape, or other medium. They should, however, be in machine-readable form.

The data bases on different computers may be related. Consequently, maintaining data integrity over distributed data bases is essential.

Individual components of the information system are distributed to where they are needed. They should be able to cope with communications links that are temporarily cut. Enough buffering should exist in each local system to maintain essential short-term operations and restore data bases.

Simplicity and reliability are achieved by allocating functions to specific and perhaps specialized hardware. In addition, where reliability and availability are essential, dual and/or fail-safe equipment can be installed.

Sending only the information needed at a given level minimizes communications requirements.

When lower-level organizations on different branches need to communicate, this can be done by going down the organizational tree and then up. In hierarchical structures, this is rarely needed.
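The buffering requirement mentioned above can be made concrete with a small sketch. The fragment below is illustrative only (the node, link, and queue are hypothetical stand-ins, not any planned Air Force component): a local system queues updates while its communications link is cut and forwards them, in order, when the link is restored.

```python
# Illustrative store-and-forward buffering at a local node (hypothetical).

from collections import deque

class LocalNode:
    def __init__(self):
        self.link_up = True
        self.buffer = deque()     # holds updates while the link is cut
        self.delivered = []       # stands in for the remote data base

    def send(self, update):
        """Forward an update, or buffer it locally if the link is down."""
        if self.link_up:
            self.delivered.append(update)
        else:
            self.buffer.append(update)   # short-term operations continue

    def restore_link(self):
        """When communications return, drain the buffer in original order."""
        self.link_up = True
        while self.buffer:
            self.delivered.append(self.buffer.popleft())

node = LocalNode()
node.send("issue 40 bolts")
node.link_up = False                 # the communications link is cut
node.send("receive 50 rivets")       # buffered locally; nothing is lost
node.restore_link()                  # data bases are brought back in step
```

The essential point is that a cut link degrades timeliness but not correctness: local operations continue, and the distributed data bases are restored to agreement once communications resume.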

FIGURE 1. A model dispersed data processing system, shown as parallel hierarchies of system levels. For an Air Force base, from global to local: Air Force or command level computers; base host computers; functional or tenant computers; and terminals. For a manufacturing organization: a business host; factory hosts; process monitoring or control computers; and machine control, instrument control, or test terminals. The figure marks increasing transaction rates, information rates, on-line reliability, availability of computer systems and of communications, and communications capacity along one axis, and increasing complexity of data and of calculations along the other. At all levels: integrity of data, good recovery and restart, and high information transfer rates.

In addition to technical considerations already touched upon, factors of personality and management style are important. For example, managers may not want the next higher levels of management to control or have access to their entire data bases, preferring to retain control of their total resources. Such nontechnical factors tend also to favor distributed systems.

Most of the above reasoning appears to apply to Air Force information systems. Because of the Air Force's mission, the final system may be more distributed than that planned for the initial implementation of Phase IV. The reasoning does not hold, however, for operational units, such as combat units that may be moved almost anywhere in the world on very short notice. Most locations, for example, lack the base support systems available at major air bases in the United States. In addition, remote locations may lack communications and computing facilities, or such facilities may be damaged or destroyed in war. Under such circumstances, each potentially mobile unit could profitably have the hardware, software, and data bases to support its operations. The technical means to do this are either here or just around the corner. In the future, each aircraft wing, for example, will probably be able to take its own information system along when it deploys to another base. The host computers at each major base in Phase IV should be able to support such mobile information systems.

As the Base Level Automation System evolves, intelligent automated data gathering, intelligent terminals, and microcomputers will probably reach much farther down in the base support/tenant hierarchy. For example, recording flight and engine histories in machine-readable form is presently feasible at a cost in weight and power that will undoubtedly decrease in the future. Voice input may be used to capture the details of maintenance functions, and voice output to coach a technician in a specialized task.
The use of computer terminals by maintenance personnel could give them ready access to current technical data and guidance. These few suggestions illustrate the many ways in which the Base Level Automation System may well extend its services. Nevertheless, the system's structure will continue to need centralized host data processing, even where many computing elements already function. These needs include:

The consolidation of common and/or global data.

General-purpose batch, time-sharing, and transaction processing.

The provision of a centralized software engineering environment and the establishment and control of interface, data base, and development standards.

The integration of maintenance activities throughout the hierarchy.

Specialized software and hardware.

6. SOFTWARE TRENDS

While current software systems can be adapted for the transition into the Phase IV Base Level Automation System, new programs will surely be needed as applications are modified, replaced, or upgraded during the program's later stages. In this regard, the software used in the middle to late 1980's should not be constrained by whatever temporary expedients may have been necessary for the transition phase. Failure to match software to hardware for best program results will hamper development and increase maintenance costs. It is precisely these costs that dominate in all large data processing systems.

The programs for the current base level system, including those that are unique to specific Air Force commands, have been developed over a long time. They represent about 5 million to 6 million lines of code. The significance of the software problem can be put into perspective by performing the arithmetic implied by the following commonly recognized rules of thumb:

Producing a new program costs more than $20 per line of source code.

Maintaining a program over its lifetime can cost four times the original development costs.

Current programmer productivity is about 10-20 lines per day.

While computer hardware costs have dropped sharply, there has not been a corresponding sharp decrease in the cost of producing computer programs. Programming costs have risen more rapidly than the rate of general inflation and in many systems are several times more than the hardware costs. The implications of this fact are not confined to the Air Force's Base Level Automation System. Users and developers of data processing systems throughout the world face escalating software costs coupled with a very slow rise in programmer productivity. In an effort to improve programmer productivity, a number of techniques have evolved.
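The arithmetic the rules of thumb above invite is straightforward; the fragment below simply combines them with the 5 million to 6 million line estimate. These are order-of-magnitude figures only, and the 15 lines per day used for the effort estimate is a mid-range assumption within the stated 10-20.

```python
# Carrying out the arithmetic implied by the rules of thumb above.

loc_low, loc_high = 5_000_000, 6_000_000   # lines of code, current system
cost_per_line = 20                         # dollars per line, development
maintenance_multiple = 4                   # lifetime maintenance vs. development
lines_per_day = 15                         # mid-range of 10-20 lines per day

dev_low = loc_low * cost_per_line                      # $100 million to develop
dev_high = loc_high * cost_per_line                    # $120 million
lifetime_low = dev_low * (1 + maintenance_multiple)    # $500 million over life
lifetime_high = dev_high * (1 + maintenance_multiple)  # $600 million over life
effort_days = loc_low // lines_per_day                 # over 300,000 programmer-days
```

Even at the low end, the base level software represents on the order of half a billion dollars in lifetime cost and hundreds of thousands of programmer-days, which is why programmer productivity dominates the discussion that follows.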
In general, one attempts to bring more discipline into the management aspects of software development teams and to frame the computer program itself in some very orderly structure. Recent developments include:

Structured programming (a categoric term for any approach that imposes an orderly process on the management of development teams or on the details of the program itself).

Top-down design (building a computer program by starting at the most general level and gradually extending the program downward into more detail), with provision for the next step in the design at each level of progression into detail.

The use of chief programmer teams that attempt to divide each large program into a number of smaller modules, each sized to be manageable by a small team of three to five members headed by a very experienced and competent chief programmer. The notion is to keep the teams small enough to ease communication among their members and, at the same time, keep their programming tasks modest enough that the basic structure of each task can readily be kept in the minds of the appropriate team members.

Hierarchy plus input-processing-output (HIPO), an aspect of top-down design in a structured system. Originally a documentation tool, it has become a technique for describing each function or module of a large program in terms of its inputs and outputs.

Each of the techniques above is described fully in the literature and is, therefore, not described in depth here. The important point is that techniques for structuring and organizing complex computer programs, and approaches for managing the teams and organizations that develop them, have progressed significantly throughout the 1970's. Organizations that have been in program development for longer than a decade or so tend to remember the old ways of doing things and may not actively exploit the most productive new ways to conceptualize and structure complex programs.

Users as Programmers

The first users of digital computers were scientists who could translate mathematical concepts into detailed sequences of instructions for computers to follow. As time has progressed, a class of professional programmers has gradually emerged who build complex computer programs in response to detailed statements of requirements from end users. Today, programming involves an intricate set of repeated interactions among programming organizations, end-user organizations, testing organizations, and sometimes still others. Typically, ultimate users cannot describe their requirements precisely enough to enable a program, when first completed, to meet their needs. Therefore, the various players in the process interact repeatedly as the program moves toward completion, consuming the time of both programmers and users. With the emergence of programming as a professional skill has come a progression of so-called higher order languages.
Such languages permit users to state needs at a more abstract and general level. Among them are FORTRAN, COBOL, PL-1, PASCAL, and ADA, each intended to improve programming productivity by raising the level of abstraction at which programmers can function. Concurrently, computer costs have dropped dramatically, so there is now less emphasis on producing programs that are extremely efficient and make the best use of computer resources. As a consequence of these trends, it has been possible in many commercial applications of ADP to write the basic software so that users themselves can program specific applications to meet their needs. This practice is particularly effective when users are also given access to program libraries and to good software development tools, such as those discussed below.
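The productivity effect of raising the level of abstraction can be shown with a small, purely illustrative sketch in a modern high-level language (the voucher amounts are invented for the example): a computation that once required an explicit, step-by-step loop can be stated in a single line whose intent is plain even to a non-programmer.

```python
# Purely illustrative: the same computation at two levels of abstraction.
vouchers = [112.50, 48.00, 305.75, 19.95]   # invented travel-voucher amounts

# Low-level style: the programmer manages every step of the iteration.
total = 0.0
i = 0
while i < len(vouchers):
    total = total + vouchers[i]
    i = i + 1

# High-level style: the intent is stated directly, and the language
# supplies the mechanics -- close to what an end user could write unaided.
total_hl = sum(vouchers)

assert total == total_hl
```

The two forms compute the same total; the difference lies entirely in how much machinery the writer must carry in his head, which is the point made above about higher order languages.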

Programming Cost Problems in Large Systems

In the Base Level Automation System, as in many of its civil counterparts, personnel costs greatly exceed those for equipment. In addition to a corps of 610 in a centralized facility that develops most software (the Air Force Data Systems Design Center), the Air Force has approximately 40 people at every base assigned to computer operations, support, and management. This total cadre of 4,555 people operates the base computer system at shift levels exceeding 15 per week--at many bases, at levels of 20 or 21.

In addition to the management and program structuring techniques intended to increase the productivity of programmers and users, there is also the question of the work environment. In the past, programs were written out laboriously on paper and then transcribed onto punched cards that were read into a computer for execution. In the best of today's program development environments, the programmer--professional or not--works at a terminal coupled directly to a computer and keys instructions directly into it. The best environments also include features for verifying the work and remedying the more common or obvious mistakes. Finally, when a program is ready for trial execution, it can be submitted directly to the computer through the terminal, and when the job is completed, the results can also be verified through the terminal.

The on-line development of computer programs is an enormous boon to a programmer's productivity. It not only allows him to communicate more directly with the machine, but it also supports him with a wide variety of aids that take care of some programming details automatically, check for mistakes, and facilitate searches for other errors that are latent in the programs. Thus, the Air Force Data Systems Design Center, a large, professional systems development organization, needs the most modern software development facilities available for its personnel.
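The automatic mistake-checking such environments provide can be sketched, in much simplified form, as a scanner that flags one common class of error before a program is ever submitted for execution. The function name and the single check shown are illustrative assumptions, not a description of any particular vendor's product.

```python
def unbalanced_lines(source_lines):
    """Return the 1-based numbers of lines whose parentheses do not
    balance -- one simple instance of the automatic error scanning
    a good development environment performs before a trial run."""
    flagged = []
    for number, line in enumerate(source_lines, start=1):
        depth = 0
        for ch in line:
            if ch == "(":
                depth += 1
            elif ch == ")":
                depth -= 1
        if depth != 0:
            flagged.append(number)
    return flagged

program = [
    "TOTAL = SUM(A, B)",
    "PRINT (TOTAL",        # closing parenthesis missing
]
print(unbalanced_lines(program))   # -> [2]
```

A real environment performs many such checks at the terminal, so the programmer learns of the mistake seconds after keying it rather than after a wasted trial run.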
Most vendors of contemporary computer systems include such capabilities in their product lines. It is even possible to get a program development facility from one vendor that can be used to produce programs for computers of other vendors. The payoff that an organization expects from an appropriate program development environment includes faster interaction of programmer and computer, automatic scanning for some errors, automatic debugging and testing, automatic handling of some programming details, and (perhaps more important than anything else) an environment in which the programmer can accomplish his complex, highly detailed job more efficiently. The characteristics normally demanded of a high-performance software development configuration include:

The ability to support simultaneously multiple CRT terminals, operating with full-screen editing.

Subsecond response times for most commands.

Powerful text editors capable of string manipulation.

Hard-copy production of source listings.

Directory and file manipulation, including file sharing and communications among users.

Job submission capabilities to another production system, if different from the development system.

Source code control commands that permit successive versions of source program modules to be controlled, maintained, and retrieved, so that every change can be accounted for and linked to version numbers of released systems.

Computers that handle the normal day-to-day operational workload frequently lack features that are essential to a high-quality program development environment. To solve this problem, one option is to add a so-called overhead computer, an additional machine equipped with an appropriate terminal-oriented environment and used only for program development. However, such features can also be implemented very efficiently on minicomputers to support dozens of simultaneous users and produce end programs for a Phase IV machine.

7. EVOLVING USES OF MICROCOMPUTERS AND MINICOMPUTERS

There are a number of reasons for the growing use of small systems utilizing microcomputers. For example, there are data processing tasks small enough to be carried out by a microsystem. The advantages of doing so include:

Favorable cost (perhaps).

Users having their own computing capabilities.

No dependence on communications.

Avoidance of the overhead involved in running a large central system.

Avoidance of large-scale breakdowns.

Encouragement of users to be innovative in exploiting computer technology for better efficiency and effectiveness.

The one disadvantage--potential loss of discipline through proliferation of small systems--can be controlled by the Air Force through prototype designs from the Air Force Data Systems Design Center (p. 35). Whatever the reasons why small computers will have growing importance at Air Force bases, it is likely that any such applications will have information interfaces with the Phase IV base-level installations, raising both hardware and software questions.
For example, a Base Level Automation Program site might need the ability to read or write 5-inch or 8-inch floppy discs, or to produce a subset of a master file for the use only of the personnel of a deploying aircraft wing. Quite aside from the hardware and software questions, there are important system-level questions. Suppose, for example, that the Air Force wanted to automate the handling of travel vouchers on a small, stand-alone system. Offhand, it would appear to be a self-contained application, but there are many interconnections that must be made. For example, there is the matter of overall budget control. Further, a local travel voucher system might share just a portion of some larger data base dealing with the same issue. Thus, there is a system-level question involving the information interface between a satellite, stand-alone computer and a central one. What might, should, or must be kept in the small computer to accomplish the local objectives? What must be kept centrally to satisfy such requirements as audit trails, overall fiscal accountability, and reporting to higher headquarters? There might be some data that should or could be kept in both places; but, if so, the master file might have to be updated from the local file, and the Air Force would have to determine how frequently that must be done. What information must be passed to higher echelons, or laterally to corresponding users at other bases?

Thus, an information flow analysis should be conducted for any functional capability proposed for a small stand-alone computer, to identify all the interfaces that need to be accommodated. The organization proposing a local capability probably could not do such an analysis, simply because it might not be aware of the overall Air Force context in which its particular activity is embedded. Technology itself is not the paramount problem, even though there are technical aspects of data exchange among computer systems (e.g., data formats, disc details). Rather, the problem is one of understanding how information is used at local, central, and higher echelons, together with all of the details of such flows.

A related issue is that of software life-cycle support. Continuing the travel voucher example, the Air Force would want to ensure that vouchers were handled uniformly at all bases. Even if one base were to take the lead in designing and developing such a system, that same base should not necessarily be forever responsible for supporting the software.
The senior authority for the functional area in question might undertake such support (e.g., the Air Force Accounting and Finance Center for anything involving financial transactions). But there may not be an appropriate senior authority for some stand-alone applications that might be desirable, or the appropriate authority may not desire the additional burden. Thus, there are aspects of proliferating small stand-alone computer systems that could create problems for the Air Force unless some level of overall control is maintained. A possible technique for this, and for leadership from the Air Force Data Systems Design Center, is to have the Center make prototype designs, for widespread use at all appropriate bases, from small-system designs developed by local users. Alternatively, some particular base could be selected to be a "prototype environment" base, to illustrate the ways in which evolutionary changes to Phase IV systems are applied in normal base operations.
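The master-file update question raised earlier--how a central file is refreshed from a local one, and what record of changes results--can be sketched as a simple reconciliation pass. The voucher numbers, record fields, and the "newer update wins" rule here are assumptions made for illustration, not Air Force procedure.

```python
# Invented records: a base's local travel-voucher file is folded
# periodically into the central master file.
master = {
    "V-1001": {"amount": 212.40, "updated": 5},
    "V-1002": {"amount": 88.00, "updated": 7},
}
local = {
    "V-1002": {"amount": 91.50, "updated": 9},   # corrected at the base
    "V-1003": {"amount": 45.25, "updated": 8},   # entered only locally
}

def update_master(master, local):
    """Copy into the master any local record that is newer than, or
    absent from, the central file, and report what changed so an
    audit trail can be maintained."""
    changed = []
    for voucher_id, record in local.items():
        current = master.get(voucher_id)
        if current is None or record["updated"] > current["updated"]:
            master[voucher_id] = record
            changed.append(voucher_id)
    return sorted(changed)

print(update_master(master, local))   # -> ['V-1002', 'V-1003']
```

How often such a pass runs, in which direction data flows, and which records must also travel to higher echelons are exactly the questions the information flow analysis described above is meant to answer.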

Office Automation as Related to Base Systems

An important use for small computers at base level is bound up in the ubiquitous word processor, which can grow into a more fully capable office support system with interfaces to other base services. Almost from the beginning, on-line computer systems included text editors used to develop programs. These editors were soon enhanced by the addition of routines that standardized output; thus, computers could process text as readily as numbers. Somewhat in parallel, typewriters were connected to simple memory devices like magnetic cards. With the advent of microprocessors, these memory typewriters quickly evolved into word processors that could not only handle text but could also process or compute numbers. Now, communications link computers and word processors.

This marriage of computing, word processing, and communications has given rise to the concept of the automated office. Information of all kinds can easily be transmitted locally or globally, and it can be processed mathematically or editorially. This has led to such services as "electronic mail" and "paperless processes," which are beginning to appear in the commercial marketplace. Electronic mail's significance does not come from transmitting information by electrical signals; that basic capability had been available for a long time. Rather, its significance is that information can be both transmitted and processed by a variety of equipment. Prescribed formats no longer hinder information flow or arbitrarily limit its content. A high degree of local autonomy is possible because necessary data can easily be extracted, recombined, manipulated, and displayed as required by each user.

Typically, office automation begins with a word processor. Even for skilled typists, word processors can improve productivity, particularly in preparing a lengthy document to which a number of people contribute and in which various sections must be revised and re-revised.
As a second step, the word processor can be connected to other word processors or computers through the telephone system or another communications network. From this point on, the benefits of office automation accrue in proportion to the degree of management planning and user feedback. Step-by-step implementation is a practical approach, but it must be developed in the context of an overall plan. The advantage is that data once entered into the system need never be entered again; the data are free to be moved and transformed to serve the needs of management where and as required. This can lead to integrated, end-to-end processes that considerably improve the performance of the functions they control.

Thus, office automation is not a single system or event; it is the long-term application of computing, communications, and office machine technology to business processes. Just as computers relieved people from the drudgery and limitations of manual arithmetic, so too can office automation relieve the drudgery and limitations of many clerical tasks. It not only improves the productivity of clerical workers, but it should materially improve the quality of the process as well.

The Air Force Data Systems Design Center might well have a significant role in the growth of office automation in the Air Force, especially since the Center is part of the Air Force Communications Command. Among its functions, the Center supplies all the on-base communications to support office automation networks, in addition to communications in support of the Phase IV installation and networking of microprocessors.

8. BASE LEVEL SYSTEMS IN WARTIME

In peacetime, the Air Force emphasizes efficiency and minimum costs. In wartime, the emphasis switches to such criteria as combat readiness, maximum availability of aircraft, and prompt maintenance. The base information infrastructure will have to accommodate two directions of change. First, the information-handling demand on base is likely to change dramatically by the 1990s. Second, the base must be ready to move promptly and smoothly from a peacetime status to one of military action whenever necessary. In this context, military action can mean anything from full-scale nuclear warfare, through theater operations (e.g., in Europe), to potential crisis involvement in any country of the world. Thus, in planning its future base-level information handling, the Air Force must ensure enough flexibility in its information systems to accommodate the variety of bases from which it operates. Its plans also have to accommodate the transition of some or all bases into military-action status at any time.

While there is an awareness of wartime planning and there are exercises for practicing contingencies, the fact is that peacetime operation tends to dominate attitudes, thinking, and planning. In peacetime, motivations are those of efficiency, minimum cost, budget consciousness, and federal government funding cycles.
During a military action, motivations will be reoriented toward such things as overall effectiveness, quick turnaround of sorties, maximum availability of aircraft, and prompt maintenance. Information flows--who transmits what data to whom, and why--are very likely to change correspondingly. The pattern of information exchange that characterizes a base during peacetime will be different during military action. New data may have to be collected, maintained, or manipulated; new uses for data may appear, or new users of data may materialize. The quantity of data flowing among established users may surge or, in some cases, vanish. For example, to offset an impaired flow of spare parts, some aircraft may be cannibalized to keep others flying. New data handling demands could arise from such circumstances. Similarly, it might prove desirable in a wartime emergency to exchange spare parts among bases; if so, a substantial lateral flow of data could develop.

The Air Force, of course, does its best to estimate what data systems will be needed in wartime. Presumably, it also will seek to estimate how wartime information needs will differ from those of peace. However, there is something of an unspoken assumption that war or other military actions will be like peace insofar as data requirements are concerned. And, to the extent that they are not, the prevailing philosophy seems to be, "We will make it work." When information was handled manually at each base, it was possible to innovate with alacrity and to create new recordkeeping processes with reasonable speed. Now, however, when much of record handling has been computerized, innovative actions to accommodate deficiencies that appear only during wartime will at best be difficult and may be impossible. Air Force bases might find themselves facing a major administrative problem.

In addition, the data processing installations at most Air Force bases in Europe and the Pacific have not been adequately protected against wartime threats. They are vulnerable to damage, disruption, or destruction by air attack, electromagnetic pulse, sabotage, or ground assault. The likelihood of continuity of normal processing in the face of these threats is quite low. The Committee has been made aware of Air Force plans to make the Base Level Automation System combat-ready and encourages the Air Force to give this planning a high priority. The Committee was told of plans to develop a combat supply system that would assume many of the supply functions performed by the Base Level Automation System but would operate on small-scale computers that could be deployed with fighting forces to satisfy all wartime support responsibilities. One particular concern to the Committee, for example, is the engine tracking system, which is vital for combat readiness of squadron aircraft. That system depends on a computer system that may not survive an attack or redeployment.
A wartime equivalent or substitute for that system, implemented on an appropriate system that is transportable and survivable, would probably be justified because of its value to the operational readiness of a deployed combat wing. Other wartime contingency measures being undertaken by the Air Force include the following:

A comprehensive analysis, managed by the Air Force Data Systems Design Center, to examine contingency planning requirements in Europe, the Pacific, and the Tactical Air Command (for rapid deployment force requirements).

Development of improved methods for reducing the vulnerability of base-level computer systems to enemy attack. This effort has included a MITRE study that recommended specific plans to develop "transportable" and survivable computer systems, packaged in containers small enough to be moved readily by air, for long-term support requirements (60-90 days after deployment).

Acquisition of a small computer system, easily moved with deploying forces in the very early stages of conflict, to support combat supply requirements. Additional systems may be acquired to support other functional users.

These actions are all necessary to the development of an adequate wartime ADP system for supporting essential administrative and operational functions. The Committee has not, however, seen enough detail to assure it that a comprehensive approach is being taken to the planning of a system adequate for wartime operations. Although the Committee has not pursued this matter further, it must strongly recommend that the Air Force conduct comprehensive planning to assure the adequacy of the base-level system for wartime as well as peacetime operations.