The Future of Supercomputing: An Interim Report (2003)

Chapter: 6. Conclusion

Suggested Citation:"6. Conclusion." National Research Council. 2003. The Future of Supercomputing: An Interim Report. Washington, DC: The National Academies Press. doi: 10.17226/10784.

6 Conclusion

The United States has always played a leadership role in bringing computing technology to bear on science and engineering research and development, advancing entirely new frontiers. The design of the earliest computers, for example Britain's Colossus and the United States' ENIAC during World War II,1 was driven by defense and national security needs. In the United States, ENIAC was followed by a steady stream of government-driven high-performance systems.2 Supercomputing continues to be important for satisfying those needs.

The particular technical approaches of any program that develops or uses supercomputing represent a complex compromise between conflicting requirements and the risks and opportunities entailed in various approaches. As described earlier in this report, an assessment of the approaches requires a detailed understanding of (1) the applications, (2) the algorithms used to solve those applications, (3) codes and programming environments, (4) the performance of codes on various platforms, (5) the likely evolution of various hardware and software technologies under various scenarios, and (6) the costs, probabilities, and risks involved in various approaches.

In its final report, the committee will seek to characterize broadly the requirements of different application classes and to examine the architecture, software, algorithm, and cost challenges and trade-offs associated with these application classes, keeping in mind the needs of the nuclear stockpile stewardship program, the broad science community, and the national security community. (Note that the identification of the distinct requirements of the stockpile stewardship program and its relation to the ASC acquisition strategy is expected to be the focus of a separate classified report by the JASONs.)
The committee believes it would be unwise to significantly redirect or reorient current supercomputing programs before careful scientific consideration has been given to the issues described above. Such changes might be hard to reverse, might reduce flexibility, and might increase costs in the future.

Exciting opportunities to advance knowledge and to serve society using supercomputing continue to emerge. The life and health sciences are becoming extraordinarily data rich, and researchers in those sciences are struggling to make sense of the data. The benefits of the genome projects for physiology and medicine cannot be realized without significant investments in computational hardware, algorithms, the software infrastructure, and the human infrastructure. The understanding of the physical world enabled by simulation and modeling is reaching ever-higher levels of fidelity and timeliness. As in many other areas of technology R&D, there seem to be sound economic and social arguments for continued government investment in supercomputing.

To sustain our leadership in supercomputing, to meet the security and defense needs of our nation, and to realize the opportunities to use supercomputing to advance knowledge, progress in supercomputing must go on. Continuity and stability in the government funding of supercomputing appear to be essential to the well-being of supercomputing in the United States.

An appropriate balance must be struck between evolutionary and innovative advances. Evolution is important because it allows present achievements to be exploited and because a diversity of approaches to supercomputing, including refinements of existing approaches, appears to be necessary to address the diversity of the computational challenges we face. Innovation in supercomputing stems from application-motivated research, which leads to experimentation and prototyping and then, in turn, to advanced development and testbeds and, finally, to deployment and products. All the stages along that path need sustained investment. Coupled innovations in architecture, in software, in algorithms, and in application strategies and solution methods are equally important. Balance is also needed between exploiting cost-effective advances in widely used hardware and software and developing custom solutions that meet the most demanding needs. As we reach the limitations of current approaches and encounter the disruptions that are unavoidable when different technologies grow at different rates, the fruits of that research and its maturation into practice will prepare us for major paradigm shifts in the future.

Notes

1. Colossus was designed to decrypt German codes. See <http://www.codesandciphers.org.uk/lorenz/colossus.htm>. ENIAC (Electronic Numerical Integrator Analyzer and Computer) was built to calculate ballistic firing tables. See <http://ftp.arl.mil/~mike/comphist/eniac-story.html>.

2. For a discussion of early government-funded projects in the late 1940s and 1950s that essentially created the early U.S. computer industry, see Chapter 3 ("Military Roots") in Kenneth Flamm, 1988, Creating the Computer: Government, Industry, and High Technology, Washington, D.C.: Brookings Institution Press.


The Committee on the Future of Supercomputing was tasked to assess prospects for supercomputing technology research and development in support of U.S. needs, to examine key elements of context--the history of supercomputing, the erosion of research investment, the changing nature of problems demanding supercomputing, and the needs of government agencies for supercomputing capabilities--and to assess options for progress. This interim report establishes context--including the history and current state of supercomputing, application requirements, technology evolution, the socioeconomic context--to identify some of the issues that may be explored in more depth in the second phase of the study.
