

1 Problem Definition and History
Pages 9-14

The Chapter Skim interface presents the single most significant chunk of text, identified algorithmically, from each page of the chapter.


From page 9...
... It is generally accepted that computer modeling and simulation offer substantial opportunities for scientific breakthroughs that cannot otherwise be realized through laboratory experiments, observations, or traditional theoretical investigations. At many of the research frontiers discussed in this report, computational approaches are essential to continued progress and will play an integral role in much of twenty-first-century science and engineering.
From page 10...
... These investments would be for the development of mathematical models, algorithms, software, hardware, facilities, training, and support -- all foundations for progress that are unlikely to develop optimally without such investment, given the career incentives that prevail in academe and the private sector. Given that the federal government has accepted this responsibility (see the section below on history) ...
From page 11...
... Much more recently, techniques such as neutron scattering, atomic force microscopy, and others have been built on a base of theory to enable investigations that were otherwise impossible. The fact that theory underpins these tools is key: Scientists needed a good ...

[Footnote] The prefix "tera-" connotes a trillion, and "flop" is an acronym for "floating point operations." Los Alamos National Laboratory announced in June 2008 that it had achieved processing speeds of over 1 petaflop/s for one type of calculation; see the news release at http://www.lanl.gov/news/index.php/fuseaction/home.story/story_id/13602.
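
To make the scale of the prefixes in the footnote concrete, the short calculation below relates teraflop/s to petaflop/s and to operations per day; it is back-of-the-envelope arithmetic, not a figure from the report.

```python
# Scale check for the prefixes defined in the footnote above;
# illustrative arithmetic only, not data from the report.
teraflop_s = 1e12   # "tera-" = 10**12 floating point operations per second
petaflop_s = 1e15   # "peta-" = 10**15, i.e. 1,000 times "tera-"

seconds_per_day = 86_400
ops_per_day = petaflop_s * seconds_per_day  # sustained 1 petaflop/s for a day

print(f"1 petaflop/s = {petaflop_s / teraflop_s:,.0f} teraflop/s")
print(f"one day at a sustained 1 petaflop/s ~ {ops_per_day:.2e} operations")
```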
From page 12...
... This interaction between numerical algorithm design and mathematical theory for the underlying partial differential equations has continued since that time, leading to the methods that make up the current state of the art in computational fluid dynamics: high-resolution methods for hyperbolic conservation laws; projection methods and artificial compressibility methods for low-Mach-number fluid flows; adaptive mesh refinement methods; and a variety of methods for representing sharp fronts. A similar connection can be seen in the development of computer infrastructure and computational science.
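
As a concrete illustration of the first of these method classes, the sketch below advances a 1D scalar hyperbolic conservation law (Burgers' equation, u_t + (u^2/2)_x = 0) using a minmod-limited MUSCL reconstruction with a Rusanov flux. The scheme, grid, and parameters are illustrative choices and are not drawn from the report.

```python
# A minimal high-resolution finite-volume solver for a 1D hyperbolic
# conservation law u_t + f(u)_x = 0, here Burgers' equation f(u) = u^2/2.
# MUSCL reconstruction + minmod limiter + Rusanov flux + SSP-RK2 in time.
import numpy as np

def minmod(a, b):
    """Slope limiter: smallest slope when signs agree, zero at extrema."""
    return np.where(a * b > 0.0,
                    np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def rhs(u, dx):
    """Spatial operator -(f(u))_x on a periodic grid."""
    f = lambda v: 0.5 * v**2                            # Burgers flux
    s = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)   # limited slopes
    uL = u + 0.5 * s                                    # state left of face i+1/2
    uR = np.roll(u - 0.5 * s, -1)                       # state right of face i+1/2
    a = np.maximum(np.abs(uL), np.abs(uR))              # local max wave speed |f'(u)|
    F = 0.5 * (f(uL) + f(uR)) - 0.5 * a * (uR - uL)     # Rusanov flux at i+1/2
    return -(F - np.roll(F, 1)) / dx                    # conservative divergence

# Example: a sine wave steepens into a shock, captured without oscillations.
N = 400
dx = 1.0 / N
x = np.linspace(0.0, 1.0, N, endpoint=False)
u = np.sin(2.0 * np.pi * x)
t, T, cfl = 0.0, 0.5, 0.4
while t < T:
    dt = min(cfl * dx / (np.abs(u).max() + 1e-12), T - t)
    u1 = u + dt * rhs(u, dx)                            # SSP-RK2 (Heun) stage 1
    u = 0.5 * (u + u1 + dt * rhs(u1, dx))               # stage 2
    t += dt
```

The limiter is what makes such a scheme "high-resolution": it retains second-order accuracy in smooth regions while suppressing the spurious oscillations an unlimited scheme would produce at the shock.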
From page 13...
... Overcoming those complications often requires nonroutine knowledge of a broad range of disciplines, including computing, algorithms, data management, and visualization. Therefore, because the individual-investigator model might not suffice for HECC (high-end capability computing), mechanisms that enable teamwork are an infrastructure requirement for such work.
From page 14...
... at least one system in the 1-10 petaflop/s range that supports a more limited number of projects demanding the highest levels of computing performance. All NSF-deployed systems will be appropriately balanced and will include core computational hardware, local storage of sufficient capacity, and appropriate data analysis and visualization capabilities.

