
The Space Science Decadal Surveys: Lessons Learned and Best Practices (2015)



4

Implementing the Decadal Survey

In this chapter, the committee explores the parts of the decadal process, and the activities following the survey, that contribute to successful implementation of the survey program. This includes elements of the program that support agencies and other government entities in the implementation of specific recommendations, as well as advice on what to do—in the opinion of the survey committee—in the case of changes in the assumptions that underlay the survey. Advisory structures have evolved at the National Academies of Sciences, Engineering, and Medicine1 and at the agencies that can provide tactical advice in executing the missions and furthering the activities of the recommended program, including an agency-sponsored, Academies-led midterm review to evaluate progress on the survey’s strategic goals. Opportunities for international and interagency collaboration are increasingly important for accomplishing the goals of decadal surveys, yet fostering and improving these collaborations remains a challenge.

DECISION RULES

The prioritized science goals of decadal surveys are developed with implicit and explicit assumptions about opportunity, cost, funding, schedule, and risk. A resilient strategic implementation plan accommodates some level of deviation from the anticipated circumstances. However, as time passes and conditions in the scientific, technical, political, and social environment change, executing the original plan may become difficult, impractical, or even counterproductive. While the specific challenges to be encountered may be unpredictable, considering alternatives to deal with likely perturbations can be useful. Recent decadal survey committees have been tasked with developing decision rules to help the agencies deal effectively with evolving reality.

Decision rules serve several purposes. First, simply by considering alternative scenarios, the survey committee can clarify its process for setting priorities. The prioritization process (described in Chapter 2) invariably involves trades among capability, cost, schedule, scope, project size, scientific balance, and risk. For example, considering the sensitivity of a priority program to cost helps both the writers and the recipients of the survey report to understand how and why choices have been made. When tactical decisions or implementation adjustments must be made, understanding the rationale is useful for agencies, policy makers, and future science advocates.

_______________

1 Activities of the National Research Council are now referred to as activities of the National Academies of Sciences, Engineering, and Medicine.


Second, reality rarely matches expectations. When external priorities shift, swings in the national economic situation take place, scientific or technical advances occur, or management ground rules change, a plan developed with quite reasonable initial assumptions can become suboptimal, if not obsolete. To the extent that decision rules anticipate such changes, they can help preserve the relevance of the strategic goals, even if the original implementation can no longer be realized.

Decision rules also provide agencies and policy makers with flexibility and insight. Flexibility is important because tactical adjustments will be much more effective and more readily accepted by all parties when they are consistent with the community’s considered judgment as expressed in the decadal survey. Insight is valuable because, if expectations cannot be met or are exceeded, all of the stakeholders will better understand what will be substituted, sacrificed, or added.

Decision rules can be as simple as contingency planning, addressing, for example, what to do when an international partner makes a particular selection. Decision rules may also define alternative strategies in case a better or worse budget scenario comes to pass. In other situations, decision rules can provide on-ramps and off-ramps for priorities that depend on anticipated scientific discoveries or technical advances.

Decision rules are less helpful when attempting to prescribe responses to unlikely or unforeseen circumstances, for which tactical responses are more appropriate. Responding to disasters, such as mission failure, is best handled by the relevant agencies themselves, in consultation with the scientific advisory system. Similarly, decisions on narrower issues, including evaluation of specific design details or trade studies, are best addressed in other ways.

Examples of Decision Rules

A formal request for decision rules in the decadal survey statement of task is a relatively recent addition that began with the 2010 astronomy and astrophysics decadal survey (Astro2010) and continued with the 2011 planetary science decadal survey (Planetary2011) and the 2013 solar and space physics decadal survey (Helio2013); the 2007 Earth science and applications from space decadal survey (Earth2007) also provided some decision rules. In each discipline, the decision rules have been different because of programmatic experience, technical feasibility, specific requests, community requirements, and discipline-specific culture.

Earth Science and Applications from Space

Earth2007 was not required to provide formal decision rules, but the survey report did provide a list of programmatic decision strategies and rules for three areas: leveraging international efforts, managing technology risk, and responding to budget pressures and shortfalls.2 The survey recognized that implementation of the Earth science program should be adjusted depending on the development of international programs, advances enabled by technology development programs, cost growth in prioritized missions, and changes to the Earth science budget. The list primarily contained strategic advice—for example, implementing a system-wide independent review process to inform decisions and accepting increased mission risk rather than descoping missions. However, two elements served as more explicit decision rules:

• Delay downstream missions in the event of small (~10 percent) cost growth in mission development. Protect the overarching observational program by canceling missions that substantially overrun.

• If necessary, eliminate specific missions related to a theme rather than whole themes.
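To illustrate the if-then character of such explicit rules, the cost-growth guidance above can be encoded as a simple threshold test. The sketch below is purely illustrative: the ~10 percent trigger comes from the survey's language, while the 30 percent cutoff used here for a "substantial" overrun is an assumed value chosen only for the example, as are the function and variable names.

```python
# Illustrative only: encodes the Earth2007 cost-growth guidance as an if-then rule.
# The "substantial overrun" threshold (0.30 here) is an assumed value for demonstration;
# the survey itself did not quantify it.

SMALL_GROWTH = 0.10        # ~10 percent growth: delay downstream missions
SUBSTANTIAL_GROWTH = 0.30  # assumed cutoff for a mission that "substantially" overruns

def cost_growth_response(baseline_cost: float, current_estimate: float) -> str:
    """Return the programmatic response suggested by the cost-growth rule."""
    growth = (current_estimate - baseline_cost) / baseline_cost
    if growth >= SUBSTANTIAL_GROWTH:
        return "cancel the overrunning mission to protect the overarching observational program"
    if growth >= SMALL_GROWTH:
        return "delay downstream missions"
    return "no programmatic action required"

# Example: a mission baselined at $500M now estimated at $560M (12 percent growth).
print(cost_growth_response(500.0, 560.0))  # -> "delay downstream missions"
```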

Astronomy and Astrophysics

Astro2010 was the first survey officially tasked to consider decision rules. The survey committee’s approach was to develop an ordered, prioritized list of projects in each size category for the field as a whole, and to construct a baseline program that would fit the expected budget scenario provided by each agency.

_______________

2 National Research Council (NRC), Earth Science and Applications from Space: National Imperatives for the Next Decade and Beyond, The National Academies Press, Washington, D.C., 2007, p. 75.



FIGURE 4.1. An artist’s concept of NASA’s proposed Wide-Field Infrared Survey Telescope. SOURCE: Courtesy of NASA Goddard Space Flight Center/CI Lab.

Decision rules, not necessarily identified as such in the report, were generally embedded in the discussion of recommended program elements. “Decision points” were identified for some major missions where future information about technology, cost, or programmatics might influence the ordering of projects. In some cases the committee went on to identify an alternative approach for achieving the science goals of the survey should less than the anticipated funding be provided. For NASA, the response to inadequate funding was often simply to delay projects by postponing their start dates and, in some cases, to withdraw from a future international mission. A specific cadence was recommended for smaller competed lines, but the possibility of slowing that cadence in response to budget shortfalls was not explicitly indicated in the recommendation. The approach was similar for the National Science Foundation (NSF) and the Department of Energy (DOE). The decision rules of Astro2010 were developed as the impact of the Great Recession on federal spending was rapidly unfolding and before the true cost of the James Webb Space Telescope (JWST) was revealed. In each case, the available resources were significantly lower than anticipated, and to date, only the highest-priority activities and programs have been started. Examples of embedded decision rules and decision points include the following:3

Programmatic. The recommendation for the top-priority large space project included a statement that mounting a joint mission with the European Space Agency (ESA) “could be a positive development if it leads to timely execution of a program that fully supports all of the key science goals of WFIRST (Figure 4.1) and leads to savings overall” (pp. 207-208). The eventual outcome was only a moderate U.S. contribution to Euclid—a full collaboration would have been delayed by lack of available funding, mainly due to JWST cost growth.

_______________

3 NRC, New Worlds, New Horizons in Astronomy and Astrophysics, The National Academies Press, Washington, D.C., 2010.


Scientific. The recommendation for the second-priority medium-sized space mission depended on “the combined space and ground-based program [being] successful in making a positive detection of B-modes from the epoch of inflation” (p. 217). Telescopes that are being designed and built have the capacity to make this discovery if the signal is as large as some theoretical expectations, so it is possible that this decision rule will soon come into effect.

Cost and technical. Support for the fourth-priority large space mission, the International X-ray Observatory (IXO), was contingent on achieving sufficient technical progress to begin a mirror technology downselect. Moreover, IXO was to be descoped if its cost exceeded $2 billion. “Prior to a start, NASA . . . should ensure that IXO’s principal risks are retired . . . with sufficient maturation to demonstrate the performance, mass, and cost” (p. 215). In practice, this rule was rendered moot by the JWST overrun and the evolution of the European X-ray program.

Planetary Science

The Planetary2011 survey committee benefited from the experience of Astro2010 and had better, but still overly optimistic, expectations about future budgets. The basic approach of Planetary2011 was to specify strict cost caps for the major recommended missions in order to maintain scientific balance. The committee went on to define compliant missions with less capability than initially proposed by the community. Specific mission decision rules that depended on future selections in competed programs were also included. In the end, the survey recommended two flight programs for the decade: a somewhat optimistic scenario and a cost-constrained program matched to the budget guidance provided by NASA. By the time the report was published, even the cost-constrained budget had proved too optimistic. The guidance provided to NASA for this scenario stated as follows:

It is also possible that the budget picture could be less favorable than the committee has assumed. If cuts to the program are necessary, the first approach should be to descope or delay flagship missions. Changes to the New Frontiers or Discovery programs should be considered only if adjustments to flagship missions cannot solve the problem. And high priority should be placed on preserving funding for research and analysis programs and for technology development.4

Solar and Space Physics (Heliophysics)

By the time of Helio2013, the full impact of the economic slowdown on the federal budget had been realized. While straining to include hints of optimism, Helio2013 adopted a conservative outlook and emphasized completing the existing program. The survey report5 recommended a pay-as-you-go approach that focused first on (1) modest changes to the science research program, (2) augmentation of the cadence of small, competed Heliophysics Explorer missions, and (3) initiation of larger directed missions in the Solar Terrestrial Probes (STP) and Living With a Star (LWS) programs. The survey also responded to specific requests for decision rules from NASA’s Heliophysics Division to address programmatic balance if the Solar Probe Plus mission grew significantly in cost early in the decade. By this stage, an explicit list of decision rules addressing possible future cuts appeared in the summary:

Decision Rule 1. Missions in the STP and LWS lines should be reduced in scope or delayed to accomplish higher priorities. Chapter 6 gives explicit triggers for review of Solar Probe Plus.

Decision Rule 2. If further reductions are needed, the recommended increase in the cadence of Explorer missions should be scaled back, with the current cadence maintained as the minimum.

Decision Rule 3. If still further reductions are needed, the DRIVE [Diversify, Realize, Integrate, Venture, Educate initiative] augmentation profile should be delayed, with the current level of support for elements in the NASA research line maintained as the minimum.6
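Taken together, these three rules describe an ordered fallback in which each successive reduction is invoked only if the preceding ones are insufficient. The following minimal sketch conveys that ordering; the sequence of actions reflects the survey's rules, but the shortfall and savings figures are hypothetical placeholders, not values from the survey.

```python
# Illustrative sketch of the tiered Helio2013 decision rules. The ordering of actions
# comes from the survey; the savings estimates and the example shortfall are hypothetical.

def apply_decision_rules(shortfall_musd: float) -> list:
    """Apply reductions in the survey's stated order until a budget shortfall is covered."""
    # (action, assumed savings in $M) -- savings figures are placeholders, not survey values
    tiers = [
        ("reduce scope of or delay STP and LWS missions", 150.0),          # Decision Rule 1
        ("scale back the planned increase in Explorer cadence", 75.0),     # Decision Rule 2
        ("delay the DRIVE augmentation profile", 50.0),                    # Decision Rule 3
    ]
    actions = []
    remaining = shortfall_musd
    for action, savings in tiers:
        if remaining <= 0:
            break
        actions.append(action)
        remaining -= savings
    return actions

# Example: a hypothetical $200M shortfall triggers Rules 1 and 2 but not Rule 3.
print(apply_decision_rules(200.0))
```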

_______________

4 NRC, Vision and Voyages for Planetary Science in the Decade 2013-2022, The National Academies Press, Washington, D.C., 2011, p. 7.

5 NRC, Solar and Space Physics: A Science for a Technological Society, The National Academies Press, Washington, D.C., 2013.

6 NRC, Solar and Space Physics, 2013, p. 11.


Helio2013 also indicated priorities for augmentations, but as of fiscal year (FY) 2015, Decision Rule 3 is operative.

Decision rules have become a more prominent feature of recent surveys because the recommended programs could not be executed with the available resources without significant deviations in scope and schedule; therefore, additional guidance was required. Decadal surveys are by nature ambitious statements of scientific goals: a better understanding of the relative importance of the goals is provided through explicit decision rules. Scientific disciplines differ—some must choose between destinations, while others rely on instruments serving large numbers of users; some require continuity or comprehensive coverage, while others attempt to address operational needs or focus on specific events or vantage points. The scientific cultures differ and so do the kinds of decision rules each discipline adopts. Each of the last four surveys also emphasized the importance of balance, both among competing scientific topics and among the various programmatic means used to achieve scientific goals. Balance is difficult to define, let alone maintain, when timing changes and when large and small programs must compete for constrained resources, but each of the disciplines has shown a strong commitment to the effort.

Lesson Learned: Decision rules provide deeper insight into the survey’s scientific priorities and reflect the wisdom and consensus of the scientific community.

Decision rules are written based on current understanding of future conditions and plausible alternatives. Not all eventualities can be anticipated, whether political, economic, technical, or scientific. Thus, care must be taken with decision rules as with all other survey recommendations, particularly as time passes. Midterm survey reviews provide an appropriate opportunity for reconsidering decision rules.

Decision rules have proven useful to NASA and other agencies, as well as to advisory committees and other stakeholders. Decision rules are particularly relevant to decisions about the scope and timing of new missions and large facilities and programs.

Although only “downside” rules have been implemented so far, having rules for positive developments, such as budget increases, is also important.

Best Practice: Decision rules ordinarily are best when strategic in nature rather than tactical. The objective is to provide insight into how science priorities evolve or change under specific circumstances without over-constraining implementation. Long-term advice that advances the scientific goals of the community is useful, whereas short-term rules quickly become obsolete or are better determined by administrators, policy makers, and community members familiar with the immediate situation.

Best Practice: The best decision rules are clear, unambiguous, and easy to implement without being overly prescriptive. Decision rules are best designed to support the achievement of survey priorities, without overly specifying the means by which they are achieved. Clear if-then rules are more useful, whereas decision rules that require interpretation may often be less helpful. For example, references to general concepts (like balance) are difficult to understand, implement, and evaluate without specific definition and guidance.

Best Practice: Decision rules need to be clearly identified and labeled as such in the survey report. In order to facilitate clear communication, all of the decision rules from the entire survey should be collected in one section, with each being traceable back to the body of the report.

Best Practice: Decision rules can be evaluated as part of the midterm assessment process. This may include reviewing existing decision rules, assessing their status, and considering whether any specific decision rule requires reevaluation due to emerging circumstances.

Whether or not decision rules are utilized depends on events that occur well after the survey is completed, but decision rules are a legacy from the decadal survey that will help in understanding and implementing the decadal survey’s science program.


STEWARDSHIP

Role of SSB Standing Committees

Strategic versus Tactical Advice

There are currently two distinct entities that advise NASA’s Science Mission Directorate (SMD) and its four divisions: the Space Studies Board (SSB) of the Academies and the NASA Advisory Council (NAC). Strategic advice pertaining to the forward vision for NASA programs and activities, and evaluation of program performance against long-term planning metrics, is generally provided through activities organized under the auspices of the SSB and its discipline-specific standing committees.7 The NAC and its committees and subcommittees generally provide tactical advice that addresses ongoing activities where scientific input is needed to ensure effective and efficient performance of an active program or mission.

Strategic Advice from the Academies’ Space Studies Board

Long-term strategic advice for NASA SMD comes from the scientific community through its representatives on the SSB’s ad hoc committees and the convening power and expertise of the SSB’s standing committees. In particular, the ad hoc committees requested by NASA and other appropriate federal agencies to produce the decadal surveys address the science of each of the four SMD divisions. Decadal survey committees and their subdiscipline panels solicit broad input from scientific and stakeholder communities, scientific society groups, and the analysis/assessment groups that provide input into the NAC. The decadal survey committees, like other ad hoc committees of the SSB, produce documents with findings and recommendations based on the consensus of the committee members in response to an approved statement of task. These findings and recommendations are provided as advice to NASA and other sponsoring agencies. Often they are regarded as only part of the input into the decision-making process. Nonetheless, both Congress and the administration take these reports seriously, and, indeed, NASA’s authorizing legislation calls on NASA to “take into account the current decadal surveys from the National Academies’ Space Studies Board when submitting the President’s budget request to the Congress” (P.L. 111-267).

Standing Up the Standing Committees

It has been traditional for the SSB standing committee of a particular discipline to “stand down” during the 2 years or more that its new decadal survey is in progress. This practice began early in the history of decadal surveys and is thought to have resulted from a concern that federal agencies might receive contradictory input from a discipline’s decadal survey and its SSB standing committee’s activities. It also reduces the workload for the community and the cost to the agencies. The actual benefits of this practice are uncertain, but the disadvantage is clear: a hiatus of 2 years or more during which NASA and other agencies are unable to engage with a standing committee on implementation of the previous decadal survey and any other time-sensitive issues that may arise.

The inability of the Academies to routinely organize activities to provide timely advice because the relevant standing committee has stood down is a serious problem. The events in the months following the release of the Astro2010 report, New Worlds, New Horizons in Astronomy and Astrophysics,8 provide a telling example. The survey report was released in August 2010, but, for a variety of reasons, the relevant standing committee—the Committee on Astronomy and Astrophysics (CAA)—was not reestablished until February 2012. So, for a period of 18 months, there was no appropriately constituted interface between the Academies and the agencies implementing Astro2010’s recommendations. However, within weeks of the release of the Astro2010 survey report, the Office of Science and Technology Policy (OSTP) requested input from the Academies concerning several astrophysical issues, including synergies and complementarities between the science goals of NASA’s Wide-Field Infrared Survey Telescope (WFIRST) and ESA’s Euclid dark-energy mission.9

_______________

7 In addition to the Space Studies Board, the Academies’ Board on Physics and Astronomy collaborates with the SSB on studies and activities related to astronomy and astrophysics, and several boards of the Academies’ Division of Earth and Life Sciences collaborate with the SSB on studies and activities related to Earth science and applications from space.

8 NRC, New Worlds, New Horizons, 2010.


Similarly, a year later, NASA asked the Academies to review its proposed plans for participation in Euclid.10 The absence of CAA’s expertise, convening power, and ability to organize activities significantly complicated the Academies’ response to these two requests. Both tasks were ultimately handled expeditiously, due in large part to a fortuitous combination of circumstances. However, a price was paid in terms of the continuity of assistance that CAA and its associated advisory activities could have provided.

Lesson Learned: As long as the standing committee restricts its work to the current program, there is no meaningful conflict that would preclude continuation of the SSB standing committees during the execution of a decadal survey.

Best Practice: SSB standing committees can continue their work throughout the period when a new decadal survey is in progress in order to provide an uninterrupted channel of communication between these committees and NASA and other agencies, with respect to strategic issues that concern the current program.

NASA Advisory Structure

The NAC, a committee reporting to the NASA Administrator and chartered under the Federal Advisory Committee Act (FACA), provides rapid tactical advice to NASA SMD and its divisions and performs annual evaluations of the performance of individual programs and projects—including space science missions. Reporting to the NAC are committees representing the various NASA directorates. In the case of SMD, the appropriate advisory group is the NAC Science Committee.

While the activities of the NAC Science Committee follow FACA guidelines, it is not officially a FACA committee and cannot provide official advice to NASA in the form of findings or recommendations, except through the NAC. Thus, the Science Committee does not officially advise the associate administrator (AA) for SMD except by providing input to the NAC that is then passed either through the NASA Administrator and back down to the AA or, more recently, directly to the AA with the NAC’s concurrence. Unofficially, the AA or deputy AA may participate in Science Committee meetings, allowing informal communication of the committee’s sentiment at the directorate level. Similarly, the subcommittees of the NAC Science Committee that represent the four SMD disciplines cannot provide official advice to the appropriate division directors, except through the Science Committee and the NAC and through their oversight of the triennial division roadmaps. As the sole representative of the Science Committee and its subcommittees, the chair of the Science Committee is the single voice charged with transmitting all science input (with all the nuances involved) to the NAC. Because the various committees and subcommittees of the NAC meet only every few months, official tactical advice at the divisional level is rarely rapid, and discipline-level issues often do not qualify as sufficiently important to justify attention from the NAC during its meetings. This situation is further compounded by the increasing use of virtual meetings, which are generally less effective as a forum for discussion leading to the development of consensus advice across subdisciplines.

Short-Term Guidance

Neither the SSB and its standing committees nor the NAC Science Committee and its subcommittees are truly empowered to provide timely advice or input to the NASA associate administrator for SMD or the subordinate division directors when short-term tactical advice on strategic programs is needed. While NASA can request advice from the community through the SSB in the form of a report from an ad hoc committee on an issue of importance, such activities can take significant time to perform.

_______________

9 For details see NRC, Report of the Panel on Implementing Recommendations from the New Worlds, New Horizons Decadal Survey, The National Academies Press, Washington, D.C., 2012.

10 For details see, NRC, Assessment of a Plan for U.S. Participation in Euclid, The National Academies Press, Washington, D.C., 2012.


The NAC’s Science Committee, because it lacks a FACA charter of its own, is also ill-equipped to provide formal advice. NASA-supported community groups, such as the discipline-specific assessment or analysis groups within the Planetary Science Division (see below), may be more nimble, yet these are not FACA groups, so their input must be fed through the NAC structure to conform to legal requirements. Congress may also solicit input on decadal survey science issues from members of the SSB or its standing committees in the form of participation in hearings. However, there are few ways for any community group to be a proactive guardian or steward of a decadal survey should Congress, the administration, or the agency elect to deviate significantly from a decadal survey’s recommended program.

In addition to the NAC and its associated committees, NASA maintains a number of assessment and analysis groups that function as a link between the science community and the discipline scientists at NASA Headquarters. While these groups do not recommend or advise, they do comment on strategic planning and science priorities and provide input on the health of their subdiscipline in the context of NASA missions. For example, COPAG (the Cosmic Origins Program Analysis Group) has been focusing on full utilization of the general-purpose observatory JWST, the first-priority space mission of the 2001 astronomy and astrophysics decadal survey,11 and on the planning for WFIRST, Astro2010’s first priority.

Similarly, the planetary assessment and analysis groups represent many of the field’s subdisciplines—MEPAG (Mars Exploration), OPAG (Outer Planets), SBAG (small solar system bodies), VEXAG (Venus Exploration), LEAG (Lunar Exploration), and CAPTEM (sample analysis).12 Their activities concern longer-term strategic planning—mission scenarios and science goals and priorities that evolve in response to new missions and observations. At one time, the chair or a representative of each of these groups served on the NAC Planetary Science Subcommittee, which provided “findings” to the NAC Science Committee on issues that were more tactical in nature. This practice of cross-membership has been discontinued.

NASA’s Heliophysics Division has MOWGs (management operations working groups), informal groups that help the discipline scientists maintain close relationships with the solar and space physics community. Among other things, the MOWGs provide perspectives on strategic planning, methodologies, mission concepts, and initiatives. There are three MOWGs—the SH-MOWG, the G-MOWG, and the LWS-MOWG (solar-heliosphere, geospace, and Living With a Star, respectively). Like the assessment and analysis groups, MOWGs do not provide formal advice but do pass findings on to the Heliophysics Subcommittee of the NAC Science Committee.

Lesson Learned: The current advisory structure does not provide an effective mechanism for short-term tactical guidance from the scientific community (i.e., tactical guidance for accomplishing strategic visions).

While the SSB’s standing committees have a clear connection to the appropriate NAC science subcommittees, enabling cross-communication, such connections with the advisory structures of the other federal agencies involved in space science activities are often less well established. As a result, stewardship of decadal survey recommendations at these agencies may be less effective than it is at NASA.

When an agency has a need for the Academies’ advice, the appropriate standing committee would work with it to draft a statement of task. Subsequently, the Academies would appoint an ad hoc committee consisting of appropriate members of the standing committee, augmented by other experts as needed depending on the nature of the specific request.

Best Practice: NASA division directors and program officers for other interested agencies (e.g., NSF, the National Oceanic and Atmospheric Administration [NOAA], the U.S. Geological Survey [USGS]) can work with the SSB’s standing committees to commission letter reports, meetings of experts, or workshops when specific advice is needed on a more rapid turnaround basis.

_______________

11 NRC, Astronomy and Astrophysics in the New Millennium, National Academy Press, Washington, D.C., 2001.

12 Additional information about these and other assessment and analysis groups can be found at Lunar and Planetary Institute, “NASA Advisory, Analysis and Assessment Groups and Resources,” last updated November 18, 2014, http://www.lpi.usra.edu/analysis/.


Tactical Advice from the AAAC

The Astronomy and Astrophysics Advisory Committee (AAAC) is a FACA-chartered committee resulting from one of the primary recommendations of the 2001 report U.S. Astronomy and Astrophysics: Managing an Integrated Program.13 The relevance of the AAAC to stewardship of decadal surveys lies in two components of the AAAC charter: (1) to assess, and make recommendations regarding, the coordination of the astronomy and astrophysics programs of NSF, NASA, and DOE; and (2) to assess, and make recommendations regarding, the status of the activities of NSF, NASA, and DOE as they relate to recommendations contained in the decadal survey reports. The AAAC’s findings and recommendations are communicated in letter reports directly to the NASA Administrator, the NSF Director, the Secretary of Energy, and the chairs of various congressional committees and other members on Capitol Hill.

While the AAAC has a role as shepherd of the decadal survey results, it is charged with providing tactical advice on project and program development, particularly for projects and programs that involve more than one agency.14 Since Astro2010, AAAC deliberations have included issues surrounding development of JWST, international cooperation in mission development (Euclid/WFIRST), heliophysics (synergies between the Daniel K. Inouye Solar Telescope and NASA missions), and restart of plutonium-238 production for deep space missions, as well as technology development, workforce issues, and the midterm survey review. The AAAC has repeatedly emphasized the need for portfolio balance among small and medium grants, projects and facilities, and programs across different disciplines—reiterating common decadal survey themes.

Dialogue between the AAAC, the standing committees of the SSB, and other stakeholders is important for effective stewardship of the survey. Unanimity of voice when in agreement, and clarity on differences when in disagreement, is important to provide agencies, Congress, the executive branch, and the scientific community with continuity as implementation of the decadal survey progresses. In addition, the AAAC can provide a vehicle for more prompt responses to tactical challenges faced by the disciplines and agencies in advancing priorities in astronomy and astrophysics.

Lesson Learned: The AAAC can play an important and unique role in stewardship of a decadal survey through its focus on interagency cooperation.

Midterm Reviews

The 2005 NASA Authorization Act (P.L. 109-155) requires that “the performance of each division in the Science directorate of NASA shall be reviewed and assessed by the National Academy of Sciences at 5-year intervals.” Thus, approximately 5 years into a decadal survey’s implementation period, a study is requested by NASA and an ad hoc committee is appointed by the Academies to prepare and release a midterm assessment. The midterm assessment details the progress made on decadal survey priorities to date and provides a chance for the scientific community to formally recommend any course corrections needed in response to programmatic or budgetary changes since the release of the decadal survey. Science priorities are typically not revisited at the time of a midterm assessment. However, changes in program implementation to better align with decadal survey recommendations may be recommended.

The structure and content of midterm assessment reports has varied among the disciplines, as driven by their statements of task (see Box 4.1), but most include a survey of recent scientific progress as well as an assessment of the implementation of decadal survey recommendations.

Midterm assessment committees need to consider how best to convey a summary of the program’s progress while remaining mindful of the wide range of audiences and uses for the report. The most recent planetary science15 and solar and space physics (heliophysics)16 midterm assessments used a grading system to convey relative progress on specific recommendations.

_______________

13 NRC, U.S. Astronomy and Astrophysics: Managing an Integrated Program, National Academy Press, Washington, D.C., 2001.

14 Astronomy and Astrophysics Advisory Committee, Report of the Astronomy and Astrophysics Advisory Committee, letter report, March 15, 2011, p. 7, http://www.nsf.gov/mps/ast/aaac/reports/annual/aaac_2011_report.pdf.

15 National Research Council (NRC), Grading NASA’s Solar System Exploration Program: A Midterm Review, The National Academies Press, Washington, D.C., 2008.


BOX 4.1
Midterm Reviews

Midterm assessments (or mid-decadal reviews) are mandated by Congress to occur at 5-year intervals as specified by the 2005 NASA Authorization Act (P.L. 109-155). To date, four midterm assessments have been completed.1

The statements of task for each of these midterm assessments share several common features. Each was charged to

• Assess how well NASA’s current program addresses the “strategies, goals, and priorities” outlined in the decadal survey and other National Research Council reports;

• Assess “Progress toward realizing these strategies, goals, and priorities”;

• Identify any actions that could be taken to maximize the science return/optimize the scientific value “in the context of current and forecasted resources”; and

• Provide guidance about implementing the recommended mission portfolio in preparation for the next decadal survey—but “not revisit or alter the scientific priorities of mission recommendations.”

The next astronomy and astrophysics midterm is currently under way. Its statement of task includes two additional items that expand its scope compared to the previous round. In particular, the 2015 midterm is charged to

• Describe the most significant scientific discoveries, technical advances, and relevant programmatic changes in astronomy and astrophysics over the years since the publication of the decadal survey; and

• Review the strategic advice provided for the agencies’ programs by federal advisory committees.

_______________

1 See the National Research Council reports Earth Science and Applications from Space: A Midterm Assessment of NASA’s Implementation of the Decadal Survey (2012); A Performance Assessment of NASA’s Heliophysics Program (2009); Grading NASA’s Solar System Exploration Program: A Midterm Review (2007); and A Performance Assessment of NASA’s Astrophysics Program (2007), all published by the National Academies Press, Washington, D.C.

Grading systems have the advantage of being relatively straightforward to implement and readily provide a logical structure for the report. However, grades can also easily be misused. For example, is a low grade cause to increase investment in an area, or a reason to cut its funding? Including explanatory text that specifies the desired response is helpful; however, it does not ensure that grades or scores will not be used out of context. The astronomy and astrophysics17 and Earth science and applications from space18 midterm assessment committees chose not to use a grading/scoring system and instead provided specific summary findings to indicate the status of each program element (see Table 4.1).

Lesson Learned: Providing scores or grades in a midterm assessment report can result in unintended consequences when used by a wide audience. When grades are used, it is best if narrative text clearly indicates the desired response to a good or bad grade.

_______________

16 NRC, A Performance Assessment of NASA’s Heliophysics Program, The National Academies Press, Washington, D.C., 2009.

17 NRC, A Performance Assessment of NASA’s Astrophysics Program, The National Academies Press, Washington, D.C., 2007.

18 NRC, Earth Science and Applications from Space: A Midterm Assessment of NASA’s Implementation of the Decadal Survey, The National Academies Press, Washington, D.C., 2012.


TABLE 4.1 Gross Characteristics of the Four Midterm Decadal Reviews Completed So Far


                                              Earth Science and         Solar and Space Physics   Planetary   Astronomy and
                                              Applications from Space   (Heliophysics)            Science     Astrophysics

Scientific progress since last decadal        Yes                       Yes                       Yes         Yes
Program/project implementation status         Yes                       Yes                       Yes         Yes
Technology development status                 Yes                       Yes                       Yes         No
Description of challenges to implementation   Yes                       Yes                       Yes         Yes
Budget history                                Yes                       Yes                       No          Yes
Used grades/scores                            No                        Yes                       Yes         No

In the course of its work, a midterm assessment committee is typically briefed by the agencies involved and other important stakeholders (e.g., congressional representatives, OSTP, the Office of Management and Budget [OMB]). This provides insight into how the program of record was developed. The midterm assessment report provides an important opportunity to share this contextual information with the broader community. The first Earth science midterm assessment,19 for example, took the approach of dissecting what happened since Earth2007 to try to elevate the community discussion beyond anecdotal concerns and to explicitly discuss the many reasons that implementation was not proceeding as envisioned. The report detailed how Earth2007 priorities were convolved with the administration’s priorities to develop a so-called “climate-centric architecture,” which was then considered the operating plan for SMD’s Earth Science Division.20

The Earth science midterm report21 went on to detail how the funding for that plan was subsequently reduced, resulting in slower than expected progress on decadal survey priorities. The report called out missions by name, explained what happened and why, and discussed what worked well and what did not. Key figures from Earth2007 (e.g., budget profiles, missions in operation) were updated to provide the community with a sense of how resources had (or had not) changed since the survey’s release. The midterm assessment thus serves an important role in making the broader community aware of the implementation plan and how it evolved from the decadal survey’s priorities.

Best Practice: Midterm assessment reports are most useful when they engage and inform the broad community by providing a progress report on implementing the decadal program, together with sufficient context to understand the rationale behind the program’s current implementation strategy.

While midterm assessments cannot solicit community input in the same way as the decadal survey process, the assessment committee members are selected to represent the breadth of the disciplinary community. Additionally, they provide expertise across science, missions, technology, and supporting infrastructure, allowing a complete assessment of the division’s performance over the approximately half-decade since the decadal survey. Corrective actions for underperforming areas and evolving scientific foci are identified in the report. In this way, midterm assessments are both a review of past performance and a document to guide activities in the following half-decade. In that sense, they act as the first step in the process for the next decadal survey and can be used to guide the community in the development of new concepts for activities in the following decade. Such concepts can then be developed and matured over the subsequent 5 years as input into the next decadal process, with consequent improvement in efficiencies.

_______________

19 NRC, Earth Science and Applications from Space, 2012.

20 NASA, Responding to the Challenge of Climate and Environmental Change: NASA’s Plan for a Climate-Centric Architecture for Earth Observations and Applications from Space, June 2010, http://science.nasa.gov/media/medialibrary/2010/07/01/Climate_Architecture_Final.pdf.

21 NRC, Earth Science and Applications from Space, 2012.


Lesson Learned: Midterm assessments offer an opportunity to initiate the process of concept development for the next decadal survey. While primarily serving to provide a report on progress to date, the assessments can also act as forward-looking documents in the preparation for future decadal planning.

INTERNATIONAL ACTIVITIES

There is strong consensus that international cooperation in space missions offers both potential and often very concrete benefits to the participants. However, effectively engaging international participants, at both the individual and institutional levels, in decadal surveys remains challenging. The current global economic situation argues for avoiding duplication of effort and instead pooling financial and human resources for large and medium-sized science projects. There are many examples where collaboration has been critical to mission success. In planetary science, for example, the Cassini-Huygens mission developed as a very successful collaboration among ESA, NASA, and the Agenzia Spaziale Italiana. Other examples include the following: TOPEX/POSEIDON, ASTER, and a succession of ocean color measurements in Earth science; SOHO, Cluster, Hinode, and Yohkoh in heliophysics; and HST, Hayabusa, Planck, Herschel, and SOFIA in astrophysics. Ground-based programs, such as ALMA, also demonstrate how fruitful international collaboration can be.

At the 2012 workshop’s session on “Incorporating International Perspectives into Future Decadal Planning,” NASA’s Dennis McSweeney stated, “International cooperation is a founding principle of NASA.”22 He went on to list the five guidelines established in the 1950s, among them: (1) NASA looks for specific programs, rather than general science, upon which to build collaborations; (2) joint projects must be of mutual scientific interest; and (3) each partner accepts financial responsibility for its contribution. Today, NASA SMD alone maintains more than 4,000 cooperative agreements with 100 nations, with some 50 active international collaborations.

The benefits of international collaboration, McSweeney said, come from many directions. Coordination of individual missions and/or mission architectures (e.g., orbital crossing times); coordination of different observations (e.g., ocean color); and provision of instruments (e.g., ASTER), spacecraft, or launch capabilities are all familiar examples. All scientific communities, especially heliophysics and Earth science, have embraced a large range of international collaborations. NASA is already sophisticated and motivated in its approach to collaborations that increase the science return.23 Specifically, international collaborations have the potential to provide benefits such as:

• Optimizing the mission definition,

• Enabling missions that could not be afforded by individual agencies,

• Enhancing mission capabilities,

• Accelerating mission implementation,

• Reducing mission cost in some tangible way to one or more partners, and

• Strengthening the national commitment to the mission.

The challenge for the decadal survey process is to determine how opportunities for international partnerships can be appropriately introduced, discussed, and prioritized. Coordinating groups exist in some areas—for example, the World Meteorological Organization, the Committee on Earth Observation Satellites, and the Committee on Space Research (COSPAR).

Interagency/Inter-Governmental Relationships

Although international cooperation is a founding principle of NASA, there is a difference between NASA’s relationships with individual national space agencies and its relationships with multinational organizations, such as ESA.

_______________

22 NRC, Lessons Learned in the Decadal Planning in Space Science: Summary of a Workshop, The National Academies Press, Washington, D.C., 2013, p. 62.

23 NRC, Lessons Learned in the Decadal Planning in Space Science, 2013, pp. 62-63.


For example, NASA’s Earth Science Division has a history of strong collaborations with individual national space agencies, but less so with ESA. Other scientific disciplines within NASA have different approaches, but overall the agency seeks scientific and mission cooperation quite broadly.

In addition, decadal surveys will have to recognize the increasing diversity of the world’s space community. Many national space agencies beyond those routinely involved in collaborations with the United States (e.g., those of Japan, Canada, and Europe) have achieved significant successes in space activities (e.g., those of China, India, Brazil, and Russia). However, while the number of nations with considerable capability in space missions continues to grow, there has been little progress in developing ways to coordinate international planning, align the goals of different players, and ensure that programs are complementary rather than competitive. There are many essential benefits to collaborating internationally; however, there are also some very real barriers. Impediments to international collaboration include, but are not limited to, the following:

• Mission selection processes that may be asynchronous and have substantial differences.

• Difficulty in securing commitment to a joint project—Who will commit first to a program that one nation cannot accomplish on its own?

• Technologies are often proprietary and not easily shared (e.g., issues associated with the International Traffic in Arms Regulations). As D. Southwood, ESA’s former director of science and robotic exploration, noted, “European scientists are very reluctant to become involved in hardware exchange because of the economic and political sensitivities of Earth remote-sensing technologies.”

• Differences in data policy.

• Community building and mission-concept development processes that may vary greatly over the world’s space agencies.

• Varying planning processes.

• Different relationships between agencies and their governments, in particular in terms of commitments to funding or the cancellation of existing commitments.

• Concerns about security and sharing of resources—for instance, the security requirements for launching missions using nuclear power sources from Europe on a European launcher, and vice versa (now that Europe is developing its own radioisotope power systems based on americium-241).

• Organizational communication and managerial issues.

• International politics.

• Cost evaluation of foreign contribution.

• Impediments to U.S. participation in foreign meetings, given current federal restrictions on travel and conference attendance.

A specific example from heliophysics illustrates the difficulty and complexity of putting together an international mission in connection with a decadal survey. The Japan Aerospace Exploration Agency (JAXA) was in the process of selecting its next solar mission during the early and middle phases of the solar and space physics (heliophysics) decadal survey. JAXA ultimately selected a high-resolution telescope to observe small-scale features on the Sun. The Solar-C mission crucially required a $100 million to $200 million contribution for a U.S. instrument, but at that point the decadal survey committee was unwilling to commit. Science considerations aside, this was partly because of uncertainty about applying the cost and technical evaluation (CATE) process to a relatively small U.S. contribution to an international mission, partly because of the timing relative to the establishment of priorities by the panels, and partly because of funding constraints early in the decade. The decadal survey ultimately endorsed the mission as a possible competitor in the Explorer/Mission of Opportunity program. Because the U.S. contribution was an essential component of its next major mission, the uncertainty engendered by the survey’s endorsement of a competition was not well received by JAXA. Solar-C is currently awaiting a new start.

Informed estimates of cost (i.e., the CATE process) need not be a “show-stopper” for international collaborations. However, attempting to get reliable estimates of mission costs when there is significant international sharing of costs raises a number of complications. If, for example, a major instrument is flown on a foreign spacecraft, the cost may be low enough that it does not meet the threshold for independent costing by the decadal survey.


Even with all good intentions, of course, an international collaboration may still fail. Sometimes foreign partners have to withdraw from partnerships. Examples abound: the United Kingdom pulled out of the Solar Dynamics Observatory mission during development and withdrew from the operation of the twin Gemini 8-meter ground-based telescopes after many years of involvement. Similarly, the United States has dropped out of important established collaborations for space missions in the recent past, even when the arrangement was to its advantage, as was the case with the proposed ExoMars 2016 and 2018 missions and the Europa Jupiter System Mission collaboration with ESA.

Best Practice: By identifying the essential features and potential challenges of international collaborative missions and projects early on, decadal surveys can recommend processes and procedures for avoiding breakdowns, thus limiting the scientific and cost impacts on both sides.

Channels of Communication and Coordination

The extent of international participation in a fundamentally U.S.-driven prioritization process will need to be considered as each new decadal survey enters the planning phase. There is no true analog to the Academies’ decadal surveys in other countries. Nevertheless, roadmaps and other scheduling information can be shared via bilateral discussions, forums, and meetings, or by including international members on the survey committees. Pre-decadal discussion of science priorities with international partners, and a full understanding of the plans of other international agencies, is needed; the intent is to achieve better alignment and implementation of space programs. Mutual representation on panels and survey committees serves the important purpose of information exchange, but it is not the source of collaborations—these more commonly develop from the bottom up between science teams (e.g., the International X-ray Observatory). International “forums” have also been suggested as a way to communicate the aspirations and plans of different space agencies.24 COSPAR or the International Space Science Institute could potentially play that role.25,26 In Earth science, the Committee on Earth Observation Satellites (CEOS) already plays such a role.27 Representation of individuals from other countries on decadal survey committees and panels can be augmented by invited presentations from other space agencies, so as to gain a more institutional perspective on future plans and programs. Furthermore, after a decadal survey is complete, communication with potential partners can be encouraged, both to seek collaborations and to clarify the decadal survey program.

Best Practice: Decadal studies can use a combination of existing scientific conferences, meetings, and symposia, as well as more targeted dialogues between survey committees and their closest analogs in the scientific advisory apparatus of other countries, to ensure that lines of communication are open.

_______________

24 At the 2012 workshop, J.-P. Swings (former chair of the European Science Foundation’s European Space Science Committee [ESSC]) proposed a global forum for established spacefaring nations and newcomers alike. The forum would generate a sense of global community; it could be organized jointly by the SSB and ESSC, for example.

25 COSPAR, the Committee on Space Research of the International Council for Science, for example, recently developed an international, interdisciplinary roadmap to advance the understanding of space weather. For details, see C.J. Schrijver, K. Kauristie, A.D. Clezio, M. Denardini, S.E. Gibson, A. Glover, et al., Understanding space weather to shield society: A global road map for 2015-2025 commissioned by COSPAR and ILWS, Advances in Space Research 55:2745-2807, 2015. Prior to this, COSPAR completed a roadmap for space astrophysics; see P. Ubertini, N. Gehrels, I. Corbett, P. de Bernardis, M. Machado, M. Griffin, et al., Future of space astronomy: A global road map for the next decades, Advances in Space Research 50:1-55, 2012. More details about COSPAR, its roadmaps, and other activities can be found at the COSPAR website at https://cosparhq.cnes.fr/ (accessed June 22, 2015).

26 The International Space Science Institute (ISSI), headquartered in Bern, Switzerland, seeks to integrate the results of space missions, ground-based observations, and laboratory experiments and add value to those results through multidisciplinary research by teams of international researchers, workshops, working groups, forums, or individual visiting scientists. For more details, see the ISSI website at http://www.issibern.ch/index.html and ISSI-Beijing at http://www.issibj.ac.cn/ (both accessed June 23, 2015).

27 CEOS was established in 1984 under the aegis of the G7 Economic Summit of Industrial Nations Working Group on Growth, Technology, and Employment and serves as the primary forum for the international coordination of space-based Earth observations. More details can be found at http://ceos.org (accessed June 23, 2015).

Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×

Best Practice: Individual, non-U.S. scientists can be invited to participate in a decadal survey. Participants need to be selected for their scientific backgrounds and expertise, not as institutional representatives, and should be cognizant of a broad range of international activities. International participants who are sufficiently experienced and senior can provide information that opens avenues for collaboration and strengthens channels of communication back to their home space agencies and national space societies and organizations.

Best Practice: Decadal reports can include specific descriptions of the types of international collaboration that the decadal survey committee finds desirable (e.g., cost-sharing, development of instrumentation, coordination of individual missions, or mission architecture).

Best Practice: Decadal reports can explicitly identify any significant programmatic uncertainties and/or craft decision rules that might be required when considering international collaborations. This may be particularly important when international collaborations are a significant component of the survey’s recommended program—in terms of budget or scientific strategy.

INTERAGENCY ISSUES

In its 2011 report on interagency cooperation, the Academies made the strong recommendation that any given space program of the U.S. government be carried out—if at all possible—by a single responsible agency.28 However, it was recognized that, in some instances, a particular agency might not have the mix of experience and technical capability needed to carry out the requisite programmatic functions and assure space mission success. In these cases, interagency cooperation would be deemed essential. Examples included instrument designs unique to DOE laboratories that support NASA Astrophysics missions and NSF-funded facilities (e.g., the Dark Energy Camera for the 4-meter Blanco Telescope at Cerro Tololo Inter-American Observatory and the giant focal-plane detector for LSST). Another example is the extensive use of NASA’s capabilities in support of the National Oceanic and Atmospheric Administration’s (NOAA) weather mission.

The committee recognizes that, in some instances, programs and community objectives in the Earth and space sciences must be carried out in an interagency or multi-agency fashion. Although a primary focus of decadal surveys, from the congressional perspective, is to provide guidance for NASA’s research-mission priorities, the implication from a national policy perspective is that decadal surveys will include in their recommended programs crucial science activities that involve DOE, NOAA, NSF, USGS, the Department of Defense, and other federal agencies, as appropriate for each of the space science disciplines.

Yet it has been clear that not all agencies consider themselves to be involved in or subject to the Academies’ decadal process. Some agencies have viewed the surveys as aimed primarily at NASA’s flight program, much like the congressional perspective, and consequently have not been inclined to support the planning or funding of the decadal process, nor have they embraced the recommendations made to them as part of the decadal survey plan.

Decadal surveys are exercises in which the relevant scientific communities develop a consensus about what the United States should accomplish over the next 10 years. This is advice given on behalf of the entire community, not just a portion (i.e., the NASA-supported portion) of the discipline. This advice cannot be truly effective if agencies—whose participation is essential for implementation—are not consulted, do not participate, and do not feel the need to respond to a survey’s recommendations. The integrity of the decadal survey process and its utility as a community and national exercise are undermined when agencies that should be involved choose not to be.

Decadal surveys are national in scope and extent. Each is intended to provide the best advice possible from, to, and for an entire scientific discipline. An agency’s involvement in a given disciplinary area implies a strong connection between that agency and its decadal survey process. To fully realize the nation’s investment, any agency that has commerce with a scientific discipline needs to include in its own planning the science goals of the discipline’s decadal survey.

_______________

28 NRC, Assessment of Impediments to Interagency Collaboration on Space and Earth Science Missions, The National Academies Press, Washington, D.C., 2011.

Best Practice: Achieving the science goals of a decadal survey and successfully implementing survey recommendations requires that the science program be acknowledged as an interagency, multi-agency activity, one that typically extends beyond the purview of a NASA SMD division.

Best Practice: Participation by all relevant agencies is optimized when decadal reports include specific descriptions of the types of interagency collaboration that the decadal survey committee finds desirable.

Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×

This page intentionally left blank.

Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 74
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 75
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 76
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 77
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 78
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 79
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 80
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 81
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 82
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 83
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 84
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 85
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 86
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 87
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 88
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 89
Suggested Citation:"4 Implementing the Decadal Survey." National Academies of Sciences, Engineering, and Medicine. 2015. The Space Science Decadal Surveys: Lessons Learned and Best Practices. Washington, DC: The National Academies Press. doi: 10.17226/21788.
×
Page 90