Introduction

Larry L. Smarr
National Center for Supercomputing Applications
University of Illinois

We are gathered together at a moment in which the country is going to have to make some critical decisions about its future, and many of the people attending will help make those decisions. I think the question we are all asking is, Why supercomputers? We certainly understand that the media like supercomputers. We read about supercomputing. We hear about it. And yet many of you here from corporations are wondering, Do I need to get involved in supercomputing? And if so, how? What are some of the reasons? It is not enough that supercomputers represent some of the most exciting technology today. Senator Gore in his keynote speech really put his finger on it. It is a technology whose use by American industry may very well determine the future of the U.S. economy in the global economy. The constant references to the supercomputer as the steam engine of the information age or the machine tool of the 1990s are shorthand attempts to capture that idea. And yet probably not more than 15 percent of the Fortune 500 companies own supercomputers, so supercomputers are not seen as being fundamental to all research, development, and manufacturing today. But if you are watching the trends, you will discover major changes happening in corporations embracing this technology. We have here today representatives from three of those trend-setting corporations: Abbott Laboratories, Eastman Kodak Company, and Apple Computer, Inc. Listen to why they are using supercomputers and listen to the struggle they are
going through internally to get their people, their scientists, and their engineers to use these machines. What I have found (and I have probably talked to several hundred industries in the last 2 years) is that in many ways the resistance is coming primarily from the scientists and engineers themselves. As I have reflected on this phenomenon, I think that much of the difficulty goes back to the key role that universities play in their relationship to industry, as well as to the critical role that federal funding for science plays in determining the pace of scientific progress and the future of this country. Between 1970 and 1985 there was a period that I have referred to as the "supercomputer famine in American universities." During this period federal funds were cut off for placing advanced computing equipment in our universities. For these 15 years, university students and professors were not doing their research on supercomputers. For 15 years industry hired students from universities who did not bring those skills and attitudes into industry that would create a demand for supercomputing. Now our country has placed, up to very high levels in industry, a whole generation of scientists, engineers, and managers who have never used, seen, or cared about a supercomputer. From this point of view, it is not difficult to understand why there has been resistance to the use of supercomputers in industry and why America has missed its opportunity to take advantage of these machines and place them at the base of the American economy. Fortunately this situation is changing extremely rapidly because of the foresight of the National Science Foundation (NSF) and the Congress. They undertook an initiative in 1984 to set up a national program for providing supercomputing access to American universities. This consisted of creating national supercomputer centers and beginning to build what Senator Gore referred to as the "superhighways of the information age."
This proposed national network would hook together the supercomputer centers with the research universities in the country, thereby coupling the personal computers or workstations on the investigators' desks with the remote supercomputers. That program now funds five supercomputer centers. Doyle Knight, a member of the steering committee for this meeting, is the director of the John von Neumann Center in Princeton, and I am the director of the National Center for Supercomputing Applications (NCSA) at the University of Illinois; Sid Karin, the director of the San Diego Supercomputer Center, is also participating in this symposium. The other two centers are the Pittsburgh Supercomputer Center and the Cornell Supercomputer Facility. All of the American supercomputer manufacturers (Cray Research, Inc.; Control Data Corporation's ETA Systems, disbanded in April 1989; and IBM Corporation) have been represented in those centers. Three years
ago, one had to leave the United States and go to Europe to get access to American-built supercomputers to do basic research. Now we have created a national infrastructure that, between the five centers, serves roughly 6000 scientists at 200 universities in the United States. Each scientist is allocated time through a peer review of proposed research to assure the quality of the projects. That is a rather miraculous discontinuous change by American time scales. What we are going to see now is that the graduates of these 200 universities will come to industries wanting to know where their supercomputers are to do their work. This will be similar to what we saw when engineers, who previously had used drafting tables, wanted to know where their CAD/CAM workstations were when they were hired. In a short period of time the whole notion of engineering CAD/CAM changed in America, and both productivity and the complexity of design increased. Through the NSF supercomputing centers, the federal government is providing universities with the kind of education and training necessary to bring new blood into the national pool of individuals trained in advanced computing. Unfortunately this does not directly help the vast current research community within industry that is not going to go through that process. What we need to do is to create additional structures to expose key industry people to the same opportunities, so that they develop the enthusiasm for advanced computing that we see among the bright young graduate students and professors in American universities. Each of the five NSF centers is pursuing industrial participation in its own way. One model will be discussed in this symposium when Cliff Perry talks of Kodak's participation in our center.
Over 60 researchers from Kodak have come to the NCSA Interdisciplinary Research Center in the last 2 years to convert codes that run on ordinary computers without visualization to ones that work in a modern distributed environment of supercomputers networked to mainframes and workstations. They can work with some of the world's experts in visualization technologies to create visual interfaces to the massive numeric data fields they compute. The most important principle that I have learned as director of a center is that if this country is really going to meet this crisis and take advantage of this opportunity, teamwork is going to be the key idea. This means both the structural teamwork among the federal government, industry, and universities and the kind of teamwork we see in our Interdisciplinary Research Center, where individual scientists, artists, computer scientists, and computer professionals work together as small teams to take on problems of enormous complexity. Teamwork is America's strong suit. It is what will pull us through this. A very good example of the results of teamwork can be seen in the art exhibit presented at this symposium by Donna Cox, an assistant professor
of art and design at the University of Illinois and an adjunct professor at NCSA. She is one of the most innovative computer artists today. She creates her art by taking the numeric output of supercomputers, in areas of science, engineering, and mathematics, and then working with what she terms a "Renaissance team" (an artist, a scientist, and a computer scientist) to create beautiful visualizations. Scientists are able to see their results through this teamwork in ways that they, as individual scientists, would never have known how to do, and the visual result ends up as pure art that is being shown in galleries all over the United States, and now internationally. I think you could hear in Senator Gore's voice, as he gave his keynote address, a sense of urgency. I believe that he feels as many of us feel. I just came back from 12 days in Japan. If I felt an urgency before, I certainly feel a great deal more urgency now. This is very serious business. I think we have possibly a 1- or 2-year window as a country to take advantage of some of the lead we have in distributed computing, visualization, and our long tradition of using supercomputers in national laboratories. But it will not happen by the normal process of diffusion on the time scale that we usually use in this country. It must be something more than that. I think that making that extra effort is what Senator Gore was challenging us to do. The first session of this symposium, "The Changing Landscape of Supercomputer Technology," is a tutorial given by two of the leading experts on the technology of supercomputers. First, Jack Worlton will talk about the various kinds of architectures one finds in the supercomputer arena, how we have reached the point where we are today, and something about the computer industry itself. Jack is a lifetime laboratory fellow of Los Alamos National Laboratory. He has spent 30 years in the laboratory in a number of key positions.
He is now president of Worlton and Associates, and he consults and lectures worldwide. He is probably the most sought-after speaker in the world today for teaching people about the technology of supercomputers. He is also, by the way, one of the world's experts on the actual details of U.S. and Japanese competition in this area. He will be followed by Steve Chen, who earned his Ph.D. at the University of Illinois under David Kuck and represents one of the brilliant people who have come out of that long tradition at Illinois since World War II in architectures and software engineering for supercomputers. After a period at Floating Point Systems, he went to Cray Research, Inc. He was chief designer of the X-MP, which has been one of the best-selling supercomputers to date. He then became senior vice president at Cray Research, Inc. Recently, he moved on to become the president and chief
executive officer of Supercomputer Systems, Inc. Steve will be talking about the technologies that we should be tracking in the next 5 years and that will make supercomputers even more super in the years to come. For the symposium's second session, "Existing Applications of Supercomputers in Industry," we have selected three different corporations that are using supercomputers in rather different fashions to maintain and increase their competitive position in the world's marketplace. Of the three, Apple Computer, Inc. is the one that actually owns its own supercomputer. Eastman Kodak Company has access to supercomputing through the NCSA and is doing interesting work onsite at Kodak with different kinds of machines, and Abbott Laboratories is in the process of deciding what to do about supercomputing. Our first speaker in the second session is Beverly Eccles, a group leader in computational chemistry at Abbott Laboratories who is on the project team evaluating supercomputers for Abbott. She obtained her Ph.D. in theoretical chemistry from the University of California at Irvine, and she has been at Abbott for 2 years. Previously she was at the Beckman Research Institute of the City of Hope, where she did image analysis. Cliff Perry, who represents Eastman Kodak Company in the second session, obtained his Ph.D. from Purdue University and then became a member of the faculty at the University of Minnesota. For the last 20 years he has been with Kodak in a variety of very interesting positions and was until recently the director of Kodak's Computational Science Laboratory. Kodak is probably one of the few corporations in America that has a computational science laboratory. Now he is the director of the Information and Computing Technologies Division, which oversees Kodak's Computational Science Laboratory. Larry Tesler, whose degree is from Stanford University, will discuss supercomputer use at Apple Computer, Inc.
He was a member of the legendary Xerox Palo Alto Research Center group, that troop of very exceptional people who generated many of the modern ideas about workstation-human interfaces and hardware construction. He joined Apple in 1980 and is currently vice president for advanced technologies.