

2 Challenges
Pages 11-18

From page 11...
... These developments present new challenges for NSF as it seeks to understand the expanding requirements of the science and engineering community; explain the importance of a new broader range of advanced computing infrastructure to stakeholders, including those that set its budget; explore non-traditional approaches; and manage the advanced computing portfolio strategically.

The Potential of Data-Intensive Computing for NSF Science and Engineering and the Corresponding Requirements

Multiple fields (e.g., materials science)
From page 12...
... to more specialized architectures, such as hybrids of general-purpose processors and graphical processing units, which have a much more highly parallel structure, further exacerbates the challenges of aligning the workflows with available computing capabilities. TECHNOLOGY CHALLENGES A number of technology challenges will affect the ability of NSF and others to deliver the desired advanced computing capabilities to the science and engineering communities.
From page 13...
... For example, simple extrapolation of existing climate models to resolve processes such as cloud formation quickly leads to a computer that requires costly and possibly impractical amounts of electrical power. These challenges and the associated uncertainty pose significant difficulties when contemplating future investment in extreme performance computers.
From page 14...
... An additional complication is that many important scientific data collections are not currently hosted in existing scientific computing centers. Recent advances in cloud data center design (including commodity processors and networks and virtualization of these resources)
From page 15...
... This may prove essential to opening NSF resources to use by new communities and enabling greater utilization. Co-location of computing and data will be an important aspect of these new environments, and such approaches may work best when the bulk of the data exchange can be kept inside a data center.
From page 16...
... Similarly, commercial cloud systems, while not an alternative for the kinds of applications that require tightly coupled capability systems, have massive aggregate computing and data-handling power.

3.  The committee will review data from NSF and the advanced computing programs it supports, and seeks input, especially quantitative data, on the computing needs of individual research areas.
From page 17...
... that must be balanced against the opportunity cost, in terms of scientific productivity, of the conventional model of allocations and job queues. On the other hand, virtualization, the implied ability to migrate work, and limited oversubscription can work to decrease overall costs, increase overall system throughput, and increase the ability of the system to meet fluctuating workloads, although perhaps at the expense of the performance of an individual job.
From page 18...
... in order to use the allocation.

6.  The committee seeks comments on the challenges facing researchers in obtaining allocations of computing resources and suggestions for improving the allocation and review processes for making advanced computing resources available to the research community.

