
Summary

The Workshop on Decadal Science Strategy Surveys was held on November 14-16, 2006, to promote discussions of the use of National Research Council (NRC) decadal surveys for developing and implementing scientific priorities, to review lessons learned from the most recent surveys, and to identify potential approaches for future surveys that can enhance their realism, utility, and endurance. The workshop involved approximately 60 participants from academia, industry, government, and the NRC. This report summarizes the workshop presentations, panel discussions, and general discussions on the use of decadal surveys for developing and implementing scientific priorities in astronomy and astrophysics, planetary science, solar and space physics, and Earth science.

WORKSHOP BACKGROUND

NRC decadal surveys provide broad assessments of the status of research fields, and they develop recommendations for scientific and programmatic priorities for future investments in the fields. Workshop participants from both inside and outside the government shared the view that the decadal surveys are important, especially because they have positive impacts on federal agency planning and decision making and on science community unity. Efforts by survey committees to draw wide community participation, engage in consensus building, and set explicit science-based priorities were repeatedly cited as features of the surveys that make them the gold standard for advice on research program planning.

While the surveys have been largely successful, government agencies and the scientific communities that have tried to follow survey advice have had to deal with several notable problems, including the following:

  • Cost and technical risk. Survey estimates of program cost and technology readiness have sometimes proved to be overly optimistic or have not included the full life-cycle costs of initiatives.

  • Resiliency and execution. Surveys have not always provided guidance on how to respond to budgetary, programmatic, or policy changes that significantly impact survey recommendations. Nor have they addressed the impacts on a balanced portfolio of large, medium, and small projects when a large development project encounters cost and/or technical trouble.

  • Planning, management, and collaboration. The charges to survey committees have often been very broad and open ended, and the surveys themselves have not always been timed to match agency or political planning cycles. Survey committees have also encountered problems when research programs are not well coordinated between agencies or nations. Some ask whether surveys should give their users recommendations on broad science objectives or on specific missions and facilities. Recent surveys have not always explicitly reconsidered the recommendations of previous surveys.

LEVERAGING PAST SUCCESSES AND IMPROVING FUTURE SURVEYS

The workshop brought together subject experts, previous survey committee members, and a broad range of survey users over a 3-day period to discuss the pros and cons of the approaches that recent surveys have taken to issues such as those noted above, so that future surveys can handle these issues as effectively as possible. This rich discussion touched on each of the three problem areas noted above; views expressed by many participants are highlighted below.

Cost and Technical Risk

Many participants noted that program cost estimates, which were based largely on information from the National Aeronautics and Space Administration, have become problematic and that too often actual costs have turned out to be as much as four times the original estimates. Participants urged that agencies continue to develop and improve parametric cost and schedule models for space missions. These models should reflect that software has become a dominant driver of cost and schedule for many missions, just as important as traditional factors such as spacecraft mass or power. Some experts also noted that instrument development is a leading contributor to cost risk, making instrument-focused risk-reduction programs especially important.

In looking to future decadal surveys, participants appeared to agree that survey committees need to do four things: (1) include cost assessment and technology experts on survey committees, (2) obtain independent cost estimates and include cradle-to-grave life-cycle costs, (3) include cost uncertainty indexes to help define the risk of cost growth, and (4) use common costing approaches so that costs for different missions or facilities can be compared. The discussions repeatedly made the point that early cost estimates are better for comparing mission costs than they are for providing absolute cost projections, because most mission candidates are far short of being ready for a preliminary design review. Consequently, participants argued for the use of cost bins, or ranges, rather than specific costs. They also suggested that the technology being considered for a mission or facility needs to be well understood or characterized before a cost for it can be estimated.
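As an informal illustration of the cost-bin idea (the bin boundaries, maturity labels, and growth factors below are hypothetical and are not drawn from the workshop), a survey committee might report each concept as a bin plus a risk-widened range rather than as a single number:

```python
# Illustrative sketch only: the concepts, bins, and growth factors below are
# hypothetical and are not taken from the workshop report.

COST_BINS = [                        # (label, lower, upper) in $ millions
    ("small", 0, 300),
    ("medium", 300, 800),
    ("large", 800, 2000),
    ("flagship", 2000, float("inf")),
]

def assign_bin(estimate_musd):
    """Report a cost bin rather than quoting the point estimate directly."""
    for label, lo, hi in COST_BINS:
        if lo <= estimate_musd < hi:
            return label
    return "unbinned"

def risk_adjusted_range(estimate_musd, maturity):
    """Widen the quoted range for less mature concepts (hypothetical factors)."""
    growth = {"pre-phase-A": 2.0, "phase-A": 1.5, "post-PDR": 1.2}
    return estimate_musd, estimate_musd * growth[maturity]

if __name__ == "__main__":
    concepts = [("Concept X", 450, "pre-phase-A"), ("Concept Y", 900, "phase-A")]
    for name, est, maturity in concepts:
        low, high = risk_adjusted_range(est, maturity)
        print(f"{name}: bin = {assign_bin(est)}, range = ${low:.0f}M-${high:.0f}M")
```

The point of such a scheme is the one made in the discussion: early estimates support comparisons among concepts far better than they support absolute cost projections.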

Resiliency and Execution

Workshop participants acknowledged that unanticipated, but seemingly inevitable, changes in the budgetary, programmatic, and/or political environment present a challenge to the ability of government agencies and the research community to implement the priorities set in a survey report. They identified a number of steps that could enhance the resiliency of a survey’s recommendations and increase the likelihood that a recommended program could be executed as proposed. For example, participants argued that survey committees should establish metrics for creating and maintaining a balanced set of projects. Many speakers felt that survey committees need to recognize that a program built on a range of missions or facilities (small, medium, and large, plus core research and technology activities) will be intrinsically more resilient than one dominated by large, complex initiatives. They supported the idea that maximizing the number of projects selected competitively will enhance program resiliency.

In discussions about how to cope with a large increase in the cost of projects and the attendant impacts of such growth, some participants felt that survey committees need to be very conservative about recommending large missions that are not yet well defined or understood. As one speaker put it, “If a mission isn’t understood well enough to derive a good cost estimate, then it doesn’t deserve a priority.” Several experts argued that even after a presumably well-founded cost estimate is in hand, a reserve (~20 percent of the expected mission cost) must be held separately from the project manager’s mission or facility development budget contingency funds.
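For illustration only, the arithmetic behind that suggestion might look like the following; the 20 percent reserve fraction is the figure cited in the discussion, while the mission cost and the 10 percent project-level contingency are hypothetical:

```python
# Hypothetical figures; only the ~20% reserve fraction comes from the discussion.
expected_mission_cost = 800.0                        # $ millions, illustrative
project_contingency = 0.10 * expected_mission_cost   # held by the project manager
program_reserve = 0.20 * expected_mission_cost       # held separately, above the project

print(f"Project budget incl. contingency: ${expected_mission_cost + project_contingency:.0f}M")
print(f"Program-level reserve (held separately): ${program_reserve:.0f}M")
```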

Finally, several participants noted that survey committees would be wise to start with a more realistic sense of agency budgetary and policy environments and to build stronger partnerships with agencies so that surveys can be more resilient.

Planning, Management, and Collaboration

Discussions about the timing of decadal surveys touched on both the time required to complete a survey and the time span over which a survey should look.

There appeared to be broad agreement that 2 years is roughly appropriate for completing a comprehensive survey and that 10 years is about the right planning horizon. Agency representatives mentioned that they get plenty of conflicting advice, so having a stable, long-term survey is very important. While there were also arguments that surveys should not be arbitrarily revised, it might make sense to build triggers into surveys, whereby cost growth or policy changes would require the survey to be revisited by a qualified group. A number of speakers suggested that scenario analysis be a part of any survey whose users want it to remain robust over a decade or, even better, that decision rules for dealing with unforeseen changes be included in the survey. Discussions of timing also drew suggestions that surveys need to be synchronized with other key planning processes (e.g., agency planning milestones and political cycles).
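One way to read the suggestion about triggers and decision rules is as a short list of explicit conditions that, if met during the decade, would send the survey back to a qualified group for review. The specific thresholds below are hypothetical illustrations, not proposals made at the workshop:

```python
# Hypothetical decision rules; the thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class ProgramStatus:
    cost_growth_fraction: float   # 0.35 means 35% above the survey's estimate
    schedule_slip_years: float
    major_policy_change: bool     # e.g., a new national directive affecting the program

def triggers_fired(status):
    """Return the decision rules whose re-review conditions are met."""
    fired = []
    if status.cost_growth_fraction > 0.30:
        fired.append("cost growth above 30 percent: reconvene a qualified review group")
    if status.schedule_slip_years > 2.0:
        fired.append("schedule slip beyond 2 years: reassess portfolio balance")
    if status.major_policy_change:
        fired.append("major policy change: revisit survey priorities")
    return fired

if __name__ == "__main__":
    print(triggers_fired(ProgramStatus(0.40, 1.0, False)))
```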

Several important factors for planning and organizing decadal surveys were mentioned repeatedly. First, former survey committee chairs noted that the survey charge must be clear and focused to avoid open-ended tasks and should be vetted fully with the research community and relevant government agencies. There was widespread agreement that surveys should have substantial community ownership and input. Some participants argued that all of the stakeholders (including the science community, federal agencies, and Congress) need to be part of the survey process (including definition, information gathering, and dissemination of results). A point that became clear in discussions of a survey’s assessment of cost, technology risk, and program execution was that survey committees need to include not only scientific disciplinary expertise, but also expertise in other areas such as hardware development, program management, systems engineering, cost estimating, and policy. There was also general agreement that survey planning should include how to disseminate the survey report to users and how to make it comprehensible and appealing to the public.

Workshop participants expressed support for the idea that surveys should remain focused on science first, so that there is a clear and compelling presentation of the important science to be done, and that the subsequent presentation of programmatic priorities and recommendations should always be traceable back to the science. Speakers also agreed that it is important to highlight applications that can be drawn from basic science missions and that can benefit society in an immediate and tangible way. Finally, many participants noted that priorities recommended in previous surveys should be readdressed in the context of the new survey and its priorities, and that recommendations in previous surveys should not be assumed to be guaranteed or irreversible.

The workshop also stimulated discussion about several aspects of internal and external coordination. First, some participants acknowledged that while interagency and international cooperative programs can promote cooperation across organizational boundaries, they tend to be very challenging to manage and rarely result in substantial cost savings. Therefore, agencies need to give extra attention to integrating such cooperative efforts as effectively as possible.


Second, participants noted that as long as human exploration of space is a major national space goal, future surveys should not altogether ignore such exploration. However, science surveys should stick with the principle of “science first” while integrating research that can be enabled by human spaceflight into overall science priorities to be recommended in a survey report.
