Managing Microcomputers in Large Organizations (1985)

Chapter: Faster, Smaller, Cheaper: Trends in Microcomputer Technology

Suggested citation: "Faster, Smaller, Cheaper: Trends in Microcomputer Technology." National Research Council. 1985. Managing Microcomputers in Large Organizations. Washington, DC: The National Academies Press. doi: 10.17226/167.

Faster, Smaller, Cheaper: Trends in Microcomputer Technology

Thomas H. Willmott*

*Thomas H. Willmott is director of user programs and a personal computer analyst for International Data Corporation, Framingham, Massachusetts.

At International Data Corporation we do market research and competitive analysis for both vendors and users of information processing equipment. Therefore, when I look at where the technology has been and where it is going, I see it from two viewpoints: where the industry is moving and how well users are adapting to the changes. But from both viewpoints the movement can be summarized in three words: faster, smaller, and cheaper.

As we look at the past and future of microcomputers it may be helpful to think in terms of a microcomputer's life cycle. I would suggest that there are three phases to this cycle. The first is hardware introduction, during which time users have the opportunity to review hardware capabilities. The second phase is the response to that original debut: whether it attracts peripheral vendors of hardware and stimulates software development that provides a wide and rich work environment. No corporation, not even IBM, can stand alone in this marketplace. All need the diversity of the expansion board vendors, the software vendors, the whole new industry that has emerged over the last few years. Thus, in the second phase it is critical to sense whether the new machine is attracting the needed kind of support. If it is, the third phase may extend well beyond the technologically useful life of the machine. It may not be the smartest processor on the block, but because of its installed base of software and its range of capabilities it is sufficient to the task at hand for the two, three, or four years needed to amortize the equipment.

If we think in historical terms, it's obvious that what we have seen so far in personal computers and microcomputers is merely an introduction to what will be coming. In the early 1960s mainframe computers were placed behind plate-glass windows and handled by men and women in white coats. Approaching the data processing facility was like going to the mountain, which Mohammed had to do, even if he was the president of the corporation. In its first stage of development the personal computer, too, was seen as a curiosity. End users found themselves relatively unprepared in technological terms to deal with microcomputers in any meaningful way. The industry had to translate the buzzwords for the end-user group; the technology, though smaller, was still rather exotic.

In this first stage we dealt with the personal computer as an individual workstation, as an independent processing unit rather than as part of a larger organizational framework. Looked upon perhaps as a toy by the MIS (management information systems) department, it was viewed with suspicion, as something that wasn't really part of the computer resources facility. We also had to deal with the whole area of shared peripherals, secondary to whatever unit was on the desk. A Winchester disk that cost $2,500, for example, was difficult to justify for a microcomputer that sold for $1,995. Thus we ended up with rolling resources: printers on a cart, which could be moved around a department and shared among a number of people. This was a rudimentary approach to what would in time become a sophisticated data-processing environment.

During the first stage we also had to explain system software to our end users, as well as come to grips with it in terms of management decisions. One problem was that a single-user operating system was completely foreign to the data processing environments in which our top managers worked. They were used to much more sophisticated operating systems. Since end users had absolutely no experience with system software, we had to teach them how to use it and explain its role in terms of system responsibility. We had to deal with many poor hardware and software decisions made by uneducated end users who purchased equipment for their departments. We found ourselves buying hardware that could not do the job it was assigned to do because application software was dreadfully lacking.

Jean Piaget, a psychologist who specialized in cognitive development, had a theory about individuals being thrown off equilibrium and then assimilating information on a new subject until they reached equilibrium again. In its first stage of development the microcomputer created a similar disequilibrium, and only now are end users and managers coming to grips with the issues raised by the microcomputer as a small, cheap workstation.

During this first stage we also had to deal with how application software was going to be defined and developed. Would it focus on task-specific, horizontal packages such as database managers, word processors, and spreadsheets? Or would the thrust be toward vertical markets, where an application could be developed that solved a total business problem? In terms of horizontal packages, we began to move from single-function to integrated software. In the vertical market there was less progress because there were fewer opportunities, both from a research and development standpoint and from a financial, business standpoint.

Hardware was still a precious resource in this first stage. Memory was a governing factor. The 8-bit CP/M operating system and a number of software graphics packages based on 8-bit technology were designed to do somersaults within a small space because memory and disk storage capacities were at a premium. Stage-one hardware was supported by a cottage industry of software programmers who appeared highly suspect to large companies and federal agencies. One of these new companies, with perhaps 10 to 12 employees, would have the hottest package in the world but a balance sheet that would make your personal checkbook look proud. We were not used to doing business with these kinds of firms.

As we moved toward the end of stage one we began to take a look at what else we could do with a microcomputer. We had figured out what it could do for us locally; now we began to look at the larger environment: the corporation as a whole, the federal agency, the state government computing resource.

In the second stage of microcomputer development, which is where we are today, we are more likely to see the microcomputer as a window on a greater distributed resource system. Today we look out from our microcomputers and talk to a mainframe, to a strip file on a minicomputer, or to a larger communications database technology. We may be tying into other organizations or services, into other resources in our own corporations, or perhaps into a local group of microcomputers, where we can share files in a more sophisticated fashion than ever before. Today the microcomputer is associated more with communications and with computing on a grand scale than it is with the idea of a stand-alone workstation. Processing capabilities are still important at the local level, but concurrent processes are more important. These include multiple windows on the same terminal, which allow the user to be active in a job that runs on a mainframe and also to merge that data into local files and manipulate it at local memory levels.

The key issue we are now facing is micro-to-mainframe communication. We are trying to answer questions about how the microcomputer fits into a total computing capability. Where are the remote data? Are they available to the end user? Shall they be made available? How often will they be updated? How does all this affect decision making? The microcomputer is coming of age. In this second stage of development the communications capability will increase rather than decrease; the demands for even broadband communications links may well be in place within the next two years.
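To make the download-and-merge cycle concrete, here is a minimal sketch in modern Python (a language that postdates this 1985 talk, used purely for illustration): fetch an extract of remote data from the central host, then fold it into a local file for local manipulation. The gateway address, file layout, and field names are all hypothetical.

```python
import csv
import io
import urllib.request

# Hypothetical gateway that serves mainframe extracts as CSV; in 1985 this
# link would have been a terminal-emulation board or a file-transfer line.
MAINFRAME_EXTRACT_URL = "http://mainframe-gateway.example.com/sales_extract.csv"

def fetch_remote_extract(url: str) -> list[dict]:
    """Download the latest extract of remote data from the central host."""
    with urllib.request.urlopen(url) as response:
        text = response.read().decode("ascii")
    return list(csv.DictReader(io.StringIO(text)))

def merge_into_local(remote_rows: list[dict], local_path: str) -> None:
    """Fold mainframe records into a local file (same layout assumed);
    a fresher remote record replaces the stale local copy with the same key."""
    if not remote_rows:
        return
    try:
        with open(local_path, newline="") as f:
            merged = {row["account_id"]: row for row in csv.DictReader(f)}
    except FileNotFoundError:
        merged = {}                          # first download: no local file yet
    for row in remote_rows:
        merged[row["account_id"]] = row      # remote data wins
    with open(local_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(remote_rows[0].keys()))
        writer.writeheader()
        writer.writerows(merged.values())

if __name__ == "__main__":
    merge_into_local(fetch_remote_extract(MAINFRAME_EXTRACT_URL), "local_sales.csv")
```

The point is the division of labor the chapter describes: the mainframe remains the system of record, while the microcomputer holds a working copy it can manipulate at local memory levels.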

The whole area of microprocessor technology, peripheral chips, and small circuit technology in general has brought additional pressures to bear on the MIS or ADP (automated data processing) planning branch. Microcomputers are only one symptom of the total technological wave that is rolling over us. We are being forced to make management changes as well, not only in the area of microcomputer planning and acquisition training, but in our whole ADP planning and telecommunications staff. We have a communications group on one hand and an ADP planning function on the other. Microprocessor technology and microcomputers have forced an organizational change in the relation between the two types of jobs in the communications area.

Microprocessor technology has in one way made capacity planning obsolete, not in terms of the skills or people required, but simply because movement is no longer from one huge machine to another. Smaller steps are now involved. We are more concerned with how to develop application software across an entire computing resource. Clearly, we have more options in the area of capacity planning than we had several years ago, when the only computing facility for a new application was a large mainframe or perhaps a distributed mini with a dedicated voice-grade line. Today we have an entire range of processing capabilities throughout our organizations, ranging from small desktop equipment to much larger machines. The variety of options offered by the new technology means that traditional job functions have to change to maintain efficiency within the organization.

Integrated circuit technology is the driving force of equipment development in the second stage. We almost have a second generation of the distributed resource, which will include small, local loops of technology (a loop of intelligent workstations or a loop of Apple computers, for example); front ends to large mainframes; twin minis, which handle strip files for management decision making; and remote communications to other networks. Based on effective management decisions concerning cost/benefit analysis, these are the directions in which we will be moving in the mid-1980s in terms of a totally distributed information resource. It is absolutely critical that organizations be concerned not only with the technology itself but also with how the technology will serve the business, raise profit levels, and improve service.

I suggest that the key issue to keep in mind when making decisions about the management of computers is that what we are doing with each new level of technology is serving a wider audience of potential users. When we had five people in white coats overseeing the major computer in the computer center, the management problem was relatively easy. We delivered data to the center, pumped it through, and if everything went according to plan, the machine gave us a printout that we could distribute on a rolling cart. Today there is still the need for that kind of application; batch applications for payroll and receivables and data-crunching operations have not gone away. However, as the technology becomes cheaper, we can serve more people with different types of equipment and software.

If we take a look at what has been happening from the vendors' standpoint, we find that in the early 1980s the 6502 and the Z80 chips were the key foundations for microcomputer development. Software developers were moving fastest in these areas and therefore attracted hardware peripheral vendors, additional software development, and user involvement in the on-site applications development process. These two early microprocessors have now been relegated to the sidelines. Their addressing limitations have caused people in the business graphics market, for example, to shift to larger 16- and 32-bit hardware. Thus, to purchase the older equipment today may be a bad business decision, though there may be nothing wrong with continuing to use it effectively if we already have it. Because of market dynamics, the purchase of both hardware and software is an important business decision. We don't necessarily want our programmers to have to solve all our application needs. Of course, if we have embedded systems or unique applications, we will want to be able to develop new applications quickly to meet our specifications. But we also want to be able to make use of new applications in the field.

It now appears that the Intel 16-/32-bit chips (the 8088 and perhaps later the 188) may provide some stability in the area of the general-purpose business tool like the IBM PC. However, the Apple Macintosh is coming in at a relatively inexpensive level using the 68000 from Motorola, and others will be fast on its heels as a third generation of application software is developed.

We can anticipate some exciting things for the future. These include object-oriented architectures, which deal in objects rather than continuous lines of code that move from one point to another, with the kind of transfer talked about in the same breath as artificial intelligence. We will also have additional capabilities on the chip itself, which will give another level of sophistication.
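As a loose illustration of what "dealing in objects" means, here is a small sketch (in Python, for illustration only; the account and ledger classes are invented): state and the operations on it are packaged together, and work is done by sending objects requests rather than by one continuous run of code.

```python
class Account:
    """State and the operations on it live together in the object."""
    def __init__(self, owner: str, balance: float = 0.0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount: float) -> None:
        self.balance += amount

    def withdraw(self, amount: float) -> None:
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount


class Ledger:
    """Coordinates accounts by sending them requests, not by reaching into their data."""
    def __init__(self):
        self.accounts: dict[str, Account] = {}

    def open(self, owner: str) -> Account:
        self.accounts[owner] = Account(owner)
        return self.accounts[owner]

    def transfer(self, src: str, dst: str, amount: float) -> None:
        self.accounts[src].withdraw(amount)   # each object enforces its own rules
        self.accounts[dst].deposit(amount)


ledger = Ledger()
ledger.open("payroll").deposit(1000.0)
ledger.open("receivables")
ledger.transfer("payroll", "receivables", 250.0)
```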

However eagerly we may anticipate such future developments, it is not necessarily best to wait for them to happen. There are many things an organization can do now to improve its productivity, even beyond an individual workstation application, to make that device earn its keep. To be concerned about technological obsolescence is a good idea from the standpoint of long-range planning, but it shouldn't prevent us from getting new work done now.

In the near term one of the major technological impacts will be in the area of peripheral chips. This is a key area in terms of the smaller, cheaper, faster syndrome. The ability to put all kinds of sophisticated programming locally at the chip level and integrate it into the workstation gives the user a smaller, cleaner, and more powerful device. This is a characteristic of the second stage. For users, stage two means looking out across a larger network of capabilities; for the vendor it means delivering more power at lower cost, with peripheral technology as well as dedicated distributed resources. Modem chips are one example of peripheral chip development; they make it possible to put a modem on an expansion board and simply have a wire attachment coming out of the personal computer. I think peripheral chip technology will make the whole area of word processing protocol conversion a nonissue in two or three years, when we will be able to encode all of a system's character formats and control codes at the chip level, all transparent to the end user.
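A rough sketch of the table-driven work that word processing protocol conversion involves, the same lookup the chapter expects to migrate into peripheral chips: translate one system's control codes into another's while passing ordinary text through. The byte values below are invented placeholders, not any vendor's actual format.

```python
# Hypothetical control codes for two word processing systems. Real systems
# each had their own formats; these byte values are placeholders only.
SYSTEM_A = {0x02: "BOLD_ON", 0x03: "BOLD_OFF",
            0x13: "UNDERLINE_ON", 0x14: "UNDERLINE_OFF"}
SYSTEM_B = {"BOLD_ON": 0x1B, "BOLD_OFF": 0x1C,
            "UNDERLINE_ON": 0x1D, "UNDERLINE_OFF": 0x1E}

def convert(document: bytes) -> bytes:
    """Translate system A's control codes into system B's, text unchanged."""
    out = bytearray()
    for byte in document:
        meaning = SYSTEM_A.get(byte)
        if meaning is not None:
            out.append(SYSTEM_B[meaning])    # re-encode the formatting command
        else:
            out.append(byte)                 # printable characters pass through
    return bytes(out)

# A bold word in system A's format becomes a bold word in system B's.
sample = bytes([0x02]) + b"URGENT" + bytes([0x03]) + b" memo follows"
print(convert(sample))
```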

What is apparent in this discussion of stage-two developments is that vendors have not underestimated, and managers of computer user environments should not underestimate, the demand both for sophisticated devices with interfaces that are easy to use and for powerful microprocessor systems. This demand changes the character of the capacity planning capability I defined earlier. Where will we deliver the application software? How will we develop it? Where should it be located in our network of processing capabilities? The question is not how much we can get, but what good use we can put it to. And further, how much raw processing power do we need at the desk? This last question is part of the larger question: where do we need power in our organization?

It is likely that in just a few years we will walk into a large mainframe computer facility and see 40 people wandering around a processor the size of a file cabinet. A brief look at what is presently available in terms of off-loading power from the mainframe and the minicomputer suggests some of the possibilities for the future:

Lisa. Though I hate rodents and think the mouse is overrated and takes away from keyboarding capabilities, Lisa suggests the potential of multiple windows. The ability to process numerous applications at one time and to move from one process to another, as we would with pieces of paper on our desk, is clearly in our future.

Apollo. Apollo represents the workstation of the future. (One is already available for $9,000.) With tremendous horsepower, and capable of relatively simplified artificial intelligence research, Apollo was chosen as the lead system at the Yale Artificial Intelligence Labs on the basis of cost performance, multiple concurrent processes, and a domain operating system in which a number of units in the network share their own resources. Rather than having a mainframe, one key computer facility, Apollo will have its intelligence distributed around the network, with additional capacity available at any time if needed. Apollo represents a new philosophy. It defines a domain as being able to reach out, not only to share files but to gain processing power from other workstations.

Synapse. Another company that has been involved in sharing processing power is Synapse. With each additional user a microprocessor is added into a backplane system. The controlling logic of the system distributes the resources of the microprocessors in the network so that the user is not limited to the power at the individual workstation.
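A toy sketch of the pooled-processing philosophy behind Apollo's domain concept and Synapse's added processors, under the simplifying assumption that "spare capacity" just means the shortest job queue; the node names and dispatch rule are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A workstation (or pooled processor board) on the network."""
    name: str
    jobs: list = field(default_factory=list)

    def load(self) -> int:
        return len(self.jobs)


class Domain:
    """Dispatches work to the least-loaded node instead of the local machine only."""
    def __init__(self, nodes):
        self.nodes = list(nodes)

    def submit(self, job: str) -> Node:
        target = min(self.nodes, key=Node.load)   # idle capacity anywhere is usable
        target.jobs.append(job)
        return target


network = Domain([Node("apollo-1"), Node("apollo-2"), Node("apollo-3")])
for job in ["compile", "report", "mail-index", "query"]:
    chosen = network.submit(job)
    print(f"{job} -> {chosen.name}")
```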

From a historical perspective, development has clearly moved from the mainframe to the microcomputer to the concept of the workstation as a window onto a distributed resources network. There has been development in other directions as well:

Wang. A recent offering from Wang gives a sense of the other capabilities that will become available. The new Wang PIC, a jazzed-up version of its personal computer, uses charge-coupled device technology to digitize a piece of paper. The user places the piece of paper containing graphics on the workstation platform and presses a button on the keyboard; the image is captured as a bit-map on the screen and integrated into the text capability. Thus we have the integration of text and graphics at a very sophisticated level, as well as the ability to store this in digital form on a drive. One of the key issues to keep in mind as we move more and more toward graphics and high-resolution bit-map displays is the necessity for high communications speeds to refresh the screens at appropriate intervals. Another element seen in some offices today is the HP-7475A, a 2- or 3-pen plotter available for under $1,000 and capable of doing excellent business graphics as part of the local PC station.

Cynthia Peripherals. A subsidiary of Honeywell, Cynthia Peripherals has been involved in Winchester disk technology, especially removable disk technologies. Winchester technology had provided a very large volume storage capability, 10 megabytes or more. When those disk platters were full, however, the user had to buy a new one, an investment of $2,500 or more. We now have removable media: the cover of the Cynthia Peripherals device flops down and the user can pull out a 10-megabyte cartridge. This is perfect for things like electronic mail, where files multiply like crazy. Now we can issue a 10-megabyte disk that has archival capabilities.
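Operationally, issuing an archival cartridge amounts to sweeping aging files off the fixed disk onto the removable volume until it fills and is swapped. A minimal sketch, with hypothetical paths and the chapter's 10-megabyte figure:

```python
import os
import shutil

MAIL_DIR = "/usr/mail/archive-queue"   # hypothetical local spool of old mail files
CARTRIDGE = "/mnt/cartridge"           # hypothetical mount point of the removable disk
CAPACITY = 10 * 1024 * 1024            # the 10-megabyte cartridge from the example

def archive_mail() -> None:
    """Move mail files onto the cartridge until it is full, oldest first."""
    used = sum(os.path.getsize(os.path.join(CARTRIDGE, f))
               for f in os.listdir(CARTRIDGE))
    pending = sorted(os.listdir(MAIL_DIR),
                     key=lambda f: os.path.getmtime(os.path.join(MAIL_DIR, f)))
    for name in pending:
        src = os.path.join(MAIL_DIR, name)
        size = os.path.getsize(src)
        if used + size > CAPACITY:
            print("Cartridge full; swap in a fresh one and rerun.")
            break
        shutil.move(src, os.path.join(CARTRIDGE, name))
        used += size

if __name__ == "__main__":
    archive_mail()
```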

With such devices we have continued to off-load resources from the mainframe and even from the minicomputer. The result is that we now have a number of interesting capabilities at the local level. Such technology clearly requires a sophisticated management environment. We are moving toward a stage, probably near the end of the decade, in which data will be available in data banks at remote locations, artificial intelligence will be incorporated at the hardware level, and logic chips and wafer technology will permit the incredible capability of 100 million instructions per second. From the user's perspective the key issue in this new stage is that the microcomputer becomes an extension of oneself, a transparent tool for the worker.