

From page 19...
... Part II: A VIEW OF NINE ...
From page 21...
... Arden, MIT Press, 1980). Paper 2 of Part II, about integrated circuits in general and microprocessors in particular, examines current trends in these most characteristic and rapidly expanding fields of hardware technology.
From page 22...
... and with general systems issues (Paper 9) but rarely with data processing per se (Paper 8).
From page 23...
... Paper 1 SYSTEMS SOFTWARE: THE PROBLEMS KEEP CHANGING INTRODUCTION Much of the history of software systems research reflects the efforts of academic codifiers to find stable, unifying generalizations in a field continually revolutionized by changes in hardware technology. The following somewhat oversimplified account of the major stages in the development of hardware will emphasize the constantly shifting priorities with which systems researchers have had to contend.
From page 24...
... Later MIT work on MULTICS also explored the memory protection and segmentation schemes needed to support highly dynamic shared use of large machines.
From page 25...
... The communications software needed for this was pioneered by the ARPANET effort, which was largely the work of nonmanufacturer software research laboratories with some participation by university groups. This effort demonstrated many important packet-switching
From page 26...
... Nevertheless, software research has a number of real accomplishments to its credit: 1. Such fundamental systems approaches as multiprogramming, time sharing, and virtual memory were all initially demonstrated at universities.
From page 27...
... They have tried to 1. demonstrate new and fundamental systems software concepts; 2.
From page 28...
... The efforts they require can be undertaken by individual researchers or small groups, producing results that are often available within a short time and often discrete enough to be immediately suitable for publication. By contrast, university researchers attempting to play role 4 often find themselves on unfavorable ground.
From page 29...
... If this tendency can be overcome, the availability of powerful small machines may make it possible for university software developers to affect practice more directly; otherwise commercial software developers and practitioners with backgrounds in a variety of application areas will call the tune. At any rate, it would certainly be unfortunate for university researchers to ignore the views of this latter group.
From page 30...
... This very successful line of work furnishes a prime example of a heavily development-oriented area of technological innovation that has been dominated by industry. Industrial organizations have the capital necessary for activity in this area, have had an immediate interest in its outcome, and were the ... however, the problems faced in perfecting complex chip designs have encouraged universities to build up VLSI design automation activities.
From page 31...
... Currently fabricated chips generally use a MOS (metal oxide semiconductor) technology, though limited but significant use is also made of other integrated circuit technologies.
From page 32...
... Between 1959 and the construction of the first microprocessor, the Intel 4004, many more developments in integrated circuit technology occurred, but none of these were so fundamental as the original inventions of the transistor, the planar transistor, and the integrated circuit, all three of which are achievements of industry. THE DESIGN AUTOMATION BOTTLENECK In the 1960s it became apparent that rapidly improving integrated circuit technology would soon permit several hundred to several thousand circuits to be fabricated on a single chip.
From page 33...
... The commercial success of these calculator chips set the stage for the development of true microprocessors. When Hoff initiated work on the Intel 4004 in 1969, intelligent controller chip designs generally involved an embedded, unchangeable program, usually fabricated using an on-chip ROM (read-only memory)
From page 34...
... Subsequent to its introduction of the 4004, Intel enhanced this product greatly to create a line of microprocessors constituting a major product family within the company. By 1975, Intel's product line was dominated by the 8080, a third-generation 8-bit micro that had several times the computational power of the 4004.
From page 35...
... Design automation aids in common use today allow full-chip designs to be constructed as compositions of prestored subdesigns. These aids are able to elaborate preliminary design sketches into fully detailed drawings, to route interconnections between specified points, to replicate predesigned cells in specified areas of a chip, and to produce logic arrays automatically from high-level specifications of a control function.
From page 36...
... Though Bell Laboratories and IBM are exceptions, most semiconductor firms sponsor very little of this kind of research. Work on algorithms and data structures for VLSI design is another area in which university efforts can easily surpass those of industry.
From page 37...
... Paper 3 THEORETICAL RESEARCH: ITS PURPOSES AND INFLUENCE Theoretical computer science is concerned with discovering the inherent power and limitations of computing techniques. This agenda proceeds in both an abstract vein of existence and nonexistence results for various models of computation and a concrete vein of algorithms and general methods for particular classes of problems.
From page 38...
... Some prevalent modes of theoretical research engender isolation. Considerable work in theoretical computer science has been devoted to excessively inward-looking corners of the field whose raison d'etre is partial fulfillment of Ph.D.
From page 39...
... For lack of a better phrase we may call this immediate recognition that causes us to do things right without thinking about them "permanent knowledge." Codification and transmission of permanent knowledge is the theoretician's responsibility.
From page 40...
... Grudging Awareness: Theoretical Algorithms in Optimizing Compilers Optimization and flow analysis remained outside of fashionable theoretical research for 15 years, until they were perceived as pragmatically significant examples of the combinatoric problems that came in the 1970s to dominate the subject of algorithms. Individual practitioners began to introduce formal techniques into this corner of combinatoric algorithms over a decade ago, but in that whole period the subject of optimizing compilers remained outside the domain of
From page 41...
... Nevertheless a transmissible framework now exists, and code optimization may be expected to join syntactic analysis as a well-established part of programming language technology. Scant Contact: Formal Semantics in Languages The history of formal semantics illustrates a different story.
From page 42...
... HOW IDEAS SPREAD The preceding discussion suggests that the fruits of theory are imprinted in permanent knowledge by catalyzing events, such as the publication of the Algol report or of the quicksort algorithm, but that the ultimate influence of the ideas thus introduced may greatly overshadow the events that initially make them known. The contributions of theory have diverse provenances: the FFT came from outside the computer science community, Algol from a committee, quicksort from an individual, and cryptography from an underground bureaucracy.
From page 43...
... As a consequence, they attracted the attention of many theoretically inclined academic computer scientists. ... be cleanly put was that of syntax analysis.
From page 44...
... Vyssotsky at Bell Laboratories and was brought to light by John Cocke, who at the time played a key role in a large IBM development group that was designing a new scientific supercomputer that was to be supported by high-efficiency FORTRAN compiling techniques. Papers by Cocke and Allen of IBM and Kildall of the Naval Postgraduate School brought this graph-theoretical approach to the attention of university computer scientists.
From page 45...
... Since Backus's first IBM FORTRAN compiler, this problem has been studied intensively by a variety of techniques, some ad hoc, others having at least some fragmentary theoretical basis. Early on, register allocation was seen to be related to the classical mathematical problem of graph coloring: assigning a minimum number of colors to the nodes of a graph, subject to the restriction that no pair of nodes connected directly by an edge may be given the same color.
From page 46...
... Recently, however, work at IBM has vindicated the coloring approach by showing experimentally that the conflict graphs actually associated with programs are generally sparse enough in structure to be colorable rapidly by a very straightforward coloring algorithm, provided that the computer for which register allocation is being performed has enough (i.e., more than a dozen) registers.
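To make the coloring approach concrete, here is a minimal sketch in Python of greedy coloring applied to a register-conflict graph. It is an illustration written for this summary, not code from the report; the conflict-graph encoding, variable names, and register count are assumptions made for the example.

    # Greedy coloring of a register-conflict graph: variables that conflict
    # (are live at the same time) must not share a register (color).
    def greedy_register_allocation(conflicts, num_registers):
        assignment = {}
        # Visit variables in a fixed order; production allocators order by
        # degree or spill cost, but a simple order suffices when the
        # conflict graph is sparse, as the IBM experiments observed.
        for var in conflicts:
            used = {assignment[n] for n in conflicts[var] if n in assignment}
            free = [r for r in range(num_registers) if r not in used]
            assignment[var] = free[0] if free else None  # None marks a spill
        return assignment

    # Example: four variables with a sparse conflict graph and three registers.
    conflicts = {
        "a": {"b", "c"},
        "b": {"a", "c"},
        "c": {"a", "b", "d"},
        "d": {"c"},
    }
    print(greedy_register_allocation(conflicts, num_registers=3))
    # -> {'a': 0, 'b': 1, 'c': 2, 'd': 0}

On this small example the four variables fit in three registers; a denser conflict graph or fewer registers would force spills, marked here as None.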
From page 47...
... Paper 5 ARTIFICIAL INTELLIGENCE: SUSTAINED UNIVERSITY PIONEERING WINS INDUSTRIAL ACCEPTANCE INTRODUCTION Artificial intelligence (AI) is the branch of computer science that collects and studies the so-called "weak" methods that problem solvers can use when "strong," algorithmic methods are unknown or impractical.
From page 48...
... An essential ingredient of artificial intelligence research is the body of heuristics, or judgmental rules, that it has collected. These provide good guesses -- justifiable but not infallible guides for problem solvers; algorithmic methods are known for some classes of problems and should be used when appropriate.
From page 49...
... In the course of doing this, researchers sometimes have been driven to invent new programming concepts that then become important in other areas of computer science. Examples are alpha-beta search, functional programming, time sharing, list processing, and garbage collection.
From page 50...
... its values and style and those of industrial development. Although industry's willingness to experiment with AI techniques is now growing very rapidly, considerable skepticism remains in industry about the complexity, size, and usability of AI programs.
From page 51...
... Seen in this light, they might be considered close relatives of numerically controlled machine tools, the most sophisticated of which are also regulated by stored programs, but which serve for cutting blank stock rather than for the manipulation and assembly of preformed parts. However, since the general environment of parts assembly is far more varied and complex than that of parts cutting, robot manipulators require programs that are more sophisticated than the simple geometric routines that suffice for numerically controlled machine tools.
From page 52...
... Joseph Engelberger, George Devol, and Maurice Dunn led the early technical work at Unimation and are still active today. An aspect of the early history worth noting is that Unimation was from the start a specialized company whose future was strongly conditioned by the need to make a success of the robot manipulators they were developing.
From page 53...
... Other significant robotic research efforts were also undertaken at the Stanford Research Institute, which concentrated on problems of locomotion and computer vision, and at Edinburgh, where problems of computer vision, but more significantly some of the basic problems of robot assembly, were also studied. The work at SRI on a robot rover that navigated in a complex room environment became well-known and helped enlarge the general view of what robotic techniques might accomplish.
From page 54...
... Here the inventiveness of a very able group of mechanical engineers contributed an outstanding mechanical device, the so-called remote center compliance device, which made it possible for a robot manipulator to mate parts that had to be fitted to closer assembly tolerances than the maximum geometric precision of the manipulators themselves. NSF-funded work at the University of Rhode Island demonstrated that computer vision could be practically and successfully combined with robot manipulation.
From page 55...
... The importance of this Stanford work can be seen by contrasting the Stanford-descended manipulators with some of the other robot manipulators being sold today that still make use of more primitive software concepts deriving from pre-1970s research. For example, the Cincinnati-Milacron manipulators are programmed in a language reminiscent of the APT machine-tool programming language.
From page 56...
... To make a robot manipulator useful commercially involves a significant software effort, which must at the very least provide for real-time manipulator control, rapid handling of sensor-generated interrupts, and complex geometric computations. These requirements will increase significantly as multiarmed robot systems come to be employed.
From page 57...
... Tactile sensing plays a particularly important role in dextrous manual assembly. The subtlety of the human tactile sense is far from being matched by the relatively crude tactile sensors currently available with robot manipulators.
From page 58...
... Force-controlled motions play an essential role in manual assembly, and the demonstrated advantages of devices like the Draper Laboratories' remote center compliance device point clearly to their importance for robot manipulators as well. Near-term research and development efforts to make motion primitives of this kind available in the commonly used robot-programming languages are therefore likely.
From page 59...
... As the potential of robot technology is realized over the next few decades through the mastery of successive practical tasks, and as the cost of robot manipulators and their controls continues to fall, economic pressures will increasingly favor wide robot deployment. The immediate technical steps that will lead in this direction are the improvement of manipulators and grippers, the close study of numerous significant applications, and the development of improved sensors and their integration into standard systems.
From page 60...
... It is for this reason that the scientific computing community demands ever larger and more powerful machines.
From page 61...
... Thus the practitioner of scientific computation needs to be familiar with all the analytic paths that define the structure of the subject with which he is working, be it hydrodynamics, plasma physics, materials science, neurophysiology, or aircraft design. However, subject-area expertise is only one of the qualifications that the successful practitioner of scientific computing must have.
From page 62...
... The fact that they need a high level of expertise in some external science means that most practitioners of large-scale computing are trained as physicists, chemists, physiologists, etc., rather than as computer scientists. It has generally proved easier for scientists to learn computing than for computer specialists to acquire these more mature sciences, perhaps because the numerical analysis required for scientific computing no longer forms any substantial part of the computer science curriculum, and the external science never did.
From page 63...
... Visits of this type furnish opportunities for numerical analysts to become aware of problems affecting major scientific computations, and for applications-oriented scientists to become aware of the latest advances in technique devised by the academic researchers. However, the frequency and breadth of this interaction are limited by the geographic remoteness of many of the large laboratory centers of scientific computing from the nearest strong university computer science departments.
From page 64...
... The decentralized small computers that most of their software-oriented colleagues prefer may be useless toys for numerical analysts interested in the problems characteristic of large scientific computations. Even where a large computer is available, the scheduling mechanisms that evolve at an academic center may give decisive preference to small interactive jobs and relegate all substantial computations to overnight or weekend processing.
From page 65...
... The coming age of large-scale parallel computation, and the probability that specialized very-high-performance VLSI chips will need to be integrated into the scientific computing environment, will accentuate these difficulties. Moreover, even though researchers active in scientific computation at universities with powerful computer science departments may find it feasible to acquire bits and pieces of numerical analysis and of computer science by osmosis from colleagues as this information is needed, the link between computer science and research laboratories carrying out major scientific computations is growing weaker, and the link with industry weaker still.
From page 66...
... Two initiatives would help to alleviate the problems we have described. The first would be to accelerate and structure training in numerical scientific computing by establishing a few model training programs specifically oriented to this field at one or two universities with strong numerical analysis groups and a tradition of large-scale scientific computing.
From page 67...
... has focused on the problems characteristic of data processing. Very little of this research is classifiable as basic, and of the little research devoted to this field, even less can be counted as truly effective.
From page 68...
... As noted, research concerned with several subareas of data processing is not entirely absent. In particular, research in data base management systems has been pursued at a number of universities, although, as already noted, the initial identification of this whole field must be credited to industrial and commercial users.
From page 69...
... It is as though the research community has seized on the concept of data base management as an important issue worthy of study in its own right, and then proceeded to investigate it from purely aesthetic points of view having little to do with the problems that concern data base users. Increasingly rarefied, much of this research has lost its vitalizing contact with the practical world of data processing that motivated data base systems in the first place.
From page 70...
... When research laboratories working in data processing become disconnected from the practical world of data processing and lose touch with practitioners, who are both the source and the consumers of the researchers' general formulations, their work falls prey to an occupational hazard: it becomes artificial and irrelevant to practice. Such research can often be the most appealing "academically," in the narrow sense of precision and formality.
From page 71...
... Paper 9 SOFTWARE DEVELOPMENT THE PROBLEM The history of software development over the past 30 years has led to the gradual and painful realization that programming a computer is inherently difficult. That realization is now deeply embedded in the consciousness of computer scientists and is one factor that separates the professional from the amateur programmer.
From page 72...
... 2. Interactive computing environments, starting with time sharing in the early 1960s, and leading to video editors and text processing in the 1970s, with the promise of widespread use of personal computer work stations, and integrated development environments in the 1980s.
From page 73...
... hardware, which may appear at the time to be uneconomical. (Initially, time sharing was thought to be a wasteful use of resources; the original work on personal computer work stations was done when such work stations were too expensive to be routinely supplied to all programmers.)

