4 Assessment of the Portfolio Review Process
The ICCGS report1 represents a significant effort to meet the challenge of developing recommendations that will enable the GS portfolio to address the priorities of the 2013 solar and space physics decadal survey2 and meet community needs within the constraints described in Chapter 2. Because a substantial fraction of GS resources is allocated to facilities, this component of the GS portfolio received particularly close scrutiny from the ICCGS. As part of assessing how well the program recommended by the ICCGS report aligns with survey priorities and with the science needs of the community, the assessment committee examined elements of the ICCGS’s review process, particularly the information on which its deliberations were based and the methodology used to assess the critical capabilities needed going forward.3
4.1 INFORMATION GATHERED BY THE PORTFOLIO REVIEW COMMITTEE
Information was requested from a wide range of stakeholders during the ICCGS review process. First, requests for information (RFIs) were sent to GS program officers to help the Portfolio Review Committee (PRC) understand all elements of the GS portfolio. There was significant turnover in GS staff during the review; the ICCGS noted that during “the roughly nine months of the review, the PRC interacted with three different Section Heads, two different facility [program managers] PMs, two different AER [aeronomy] PMs, and the three standing PMs for the MAG, STR and SWR programs [magnetospheric physics, solar-terrestrial science, and space weather research]” (ICCGS Section 9.6). This turnover presented an additional challenge to both the PRC and GS.
For facilities, targeted RFIs were sent to the six GS Class 1 facility4 principal investigators (PIs), to the four GS Class 2 facility PIs, and to the CRRL PI.5 Facility PIs were then interviewed by telephone.
___________________
1 National Science Foundation (NSF), 2016, Investments in Critical Capabilities for Geospace Science 2016 to 2025, Geospace Section of the Division of Atmospheric and Geospace Science, February 5, https://www.nsf.gov/geo/adgeo/geospace-review/geospace-portfolio-reviewfinal-rpt-2016.pdf.
2 National Research Council (NRC), 2013, Solar and Space Physics: A Science for a Technological Society, The National Academies Press, Washington, D.C.
3 The Portfolio Review Committee’s review process, community input, and guiding principles are described in ICCGS Chapter 2 (NSF, 2016); the methodology for establishing critical capabilities is provided in ICCGS Chapter 5 (NSF, 2016).
4 Definitions of Class 1 and Class 2 facilities are provided in ICCGS Chapter 7.2.3 (NSF, 2016).
5 The RFIs sent to the facility PIs are reproduced in Appendix C of the ICCGS report.
Cognizant of interfaces with programs and facilities both within and external to NSF, the PRC reached out to the leadership of several of these programs, interviewing the NCAR/HAO director, the National Solar Observatory (NSO) director, the EISCAT scientific association director, and the acting head of NSF-GEO Polar Programs (PLR).
The PRC solicited information from the wider geospace science community by requesting that input be submitted to an e-mail address that was set up for this purpose. Forty-seven responses were received. Furthermore, town hall meetings were convened at each of the CEDAR, GEM, and SHINE meetings during the summer of 2015 to garner viewpoints of community members.
The assessment committee acknowledges the lengths to which the PRC went to gather information from the relevant stakeholders. The committee had concerns, however, regarding the transparency of the information gathered, particularly that related to GS facilities. The PRC was internal to NSF, not a public advisory committee subject to the terms of the Federal Advisory Committee Act.6 Public access to data gathered by the PRC was not required, nor was NSF able to make the data generally available to the assessment committee. Other than a short description in ICCGS Section 7.3.1, evidence used by the PRC to determine the comparative productivity and scientific utility of the facilities was not provided explicitly in the ICCGS report or in an appendix. This made it difficult for the assessment committee—and by extension the community—to fully understand the basis for ICCGS findings and recommendations regarding facilities.
In conversations and communications with the PRC chair7 and other PRC members, the assessment committee learned that a great deal of quantitative data and qualitative information was collected from the facility PIs and NSF, including background information on the technical capabilities of facilities as well as additional facility productivity metrics. Additional targeted information was requested via an RFI sent to each facility or was received as written follow-up in response to phone interviews with facility PIs. NSF provided annual facility reports as well as copies of proposal documents for recent awards. These were reviewed by the PRC for consistency with other information received from facility PIs, GS staff, and community inputs. As a result, the ICCGS had a substantial array of information, data, and metrics upon which to base its evaluation. For facilities, this included the following:
- Hours of operation of the facility per annum in recent years;
- Publications from each facility for at least 5 years;
- Number of facility users and data users;
- Current state of maintenance on all facilities;
- Future science and technology plans for each facility, including some costs;
- Various sources of funding for a facility;
- International agreements with which the facilities were involved; and
- Present and future plans of the facilities in support of the survey.
Finding: The PRC collected substantial amounts of information and data about each GS facility in order to perform its comparative assessment. Little of this information and data, however, was presented in the ICCGS report.
The PRC learned that although each facility routinely collects considerable performance data, the methodology is not consistent across facilities. ICCGS Rec. 7.36 states that GS should develop a standard reporting format that allows facilities to report a common set of annual performance data and metrics to NSF in a uniform way. The assessment committee endorses this recommendation but cautions that administering and utilizing data and metrics from facilities and other GS programs may require additional resources and expertise in informatics that are not currently available to GS. See Chapter 6 for further discussion.
___________________
6 General Services Administration, “The Federal Advisory Committee Act,” last reviewed April 12, 2016, http://www.gsa.gov/portal/content/100916.
7 Personal communications via email from William Lotko, Dartmouth College, Portfolio Review Committee chair to Timothy Bastian, committee chair, August 3, 2016.
Conclusion: GS has not had a standard set of performance metrics by which it uniformly evaluates facilities. The assessment committee endorses the ICCGS’s recommendation to GS to develop a common set of annual metrics from each facility.
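To make the idea of a common reporting format concrete, a standard annual record could be as simple as a structured document with agreed field names covering the categories of facility data the PRC collected (operating hours, publications, users, funding sources, and agreements). The sketch below is purely illustrative; the schema, field names, and example values are hypothetical and are not NSF's actual reporting format.

```python
from dataclasses import dataclass, field, asdict
from typing import List
import json

@dataclass
class FacilityAnnualReport:
    # Hypothetical field names, loosely drawn from the categories of
    # facility data listed above; this is NOT NSF's actual schema.
    facility: str
    year: int
    hours_of_operation: float          # annual operating hours
    publications: int                  # papers attributable to the facility
    facility_users: int
    data_users: int
    funding_sources: List[str] = field(default_factory=list)
    international_agreements: List[str] = field(default_factory=list)

    def to_json(self) -> str:
        """Serialize the record to a common machine-readable format."""
        return json.dumps(asdict(self), sort_keys=True)

# One illustrative annual record for a fictitious facility
report = FacilityAnnualReport(
    facility="Example Facility",
    year=2015,
    hours_of_operation=3200.0,
    publications=45,
    facility_users=120,
    data_users=800,
    funding_sources=["NSF GS"],
)
record = json.loads(report.to_json())
```

Because every facility would emit the same fields in the same machine-readable form, records could be aggregated and compared directly, which is the uniformity the ICCGS recommendation is after.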
4.2 THE PORTFOLIO REVIEW ALIGNMENT WITH SURVEY PRIORITIES
The ICCGS response to decadal survey priorities in reviewing the GS program was highly constrained by the scope of the review and the available budget. Survey priorities for NSF are summarized in Section 1.3. The ICCGS considered the survey’s baseline priority of “completing the current program” as it applied to the existing suite of GS facilities. In considering the recommendations of the DRIVE initiative, which call for freeing resources and evolving the GS program to address future facilities, programs, and activities within the context of systems science, the ICCGS turned to the science goals and challenges identified in the survey to determine the critical capabilities needed.
Chapter 5 of the ICCGS report presents a thorough summary of the survey science program. The key science goals presented in the survey flow down to science challenges identified for atmospheric-ionosphere-magnetosphere interactions, solar wind-magnetosphere-ionosphere interactions, and solar and heliospheric physics.8 In addition, the overarching objective for Space Weather and Prediction (SWP) is summarized in ICCGS Section 5.4, as are critical capabilities to support the objective. The ICCGS maps survey science challenges and SWP challenges to required capabilities (observational, theory and modeling, and data exploitation) and recommended GS investments—in both current and future GS programs and facilities.9 The ICCGS also identifies external capabilities and partners of potential interest to GS within NSF-AGS, across NSF, at other U.S. agencies, and internationally. This level of detail goes well beyond the top-level survey recommendations and demonstrates that the ICCGS paid careful attention to the implications of specific survey science goals.
However, while clearly informed by the survey science program, the ICCGS report does not describe how the particular recommended investments provide the capabilities required to address the survey science goals. For example, the report does not assess the contribution of each GS facility to the science goals or explain how the relative facility priorities were established. The ICCGS describes program and facility priorities in ICCGS Chapter 9 (see also ICCGS Table 9.1), but neither the prioritization process nor the criteria used to establish the priorities is defined.
Finding: The ICCGS report does not explain how the recommended investments in particular programs and facilities satisfy the required capabilities to address the decadal survey science goals. The process used for establishing the relative priorities between facilities and for program elements is not defined in the report.
4.3 CONCLUSIONS REGARDING THE PORTFOLIO REVIEW PROCESS
The assessment committee has identified two areas of concern regarding the ICCGS process. First, little of the extensive information, data, and metrics gathered from GS facilities is presented in the report. This made it difficult for the assessment committee—and by extension the community—to understand the basis for the findings and recommendations made in the ICCGS report. Second, the ICCGS does not explain how the recommended investments in facilities meet the critical capabilities needed to address decadal survey goals, or how the relative priorities between facilities and for program elements were established. This contributes to a perceived lack of transparency that may undermine the community’s confidence in the deliberations that underpinned the recommendations made to GS.
The assessment committee had the benefit of discussing the portfolio review process and the ICCGS report with the PRC chair and members of the PRC during the course of its assessment. The assessment committee is reassured that considerable facility data and metrics were reviewed in a conscientious and comprehensive fashion
___________________
8 The survey science goals and challenges are summarized in NSF, 2016, Table 5.1.
9 NSF, 2016, Investments in Critical Capabilities for Geospace Science 2016 to 2025, Table 5.2.
and, while alternative recommendations could have emerged from the process, the PRC has fulfilled its charge within the extremely challenging constraints discussed in Chapter 2. The ICCGS did not, and in some instances could not, fully address all survey priorities. These priorities will be identified and discussed in Chapter 5.
Conclusion: The PRC fulfilled its charge within the imposed constraints. The portfolio review process and the resulting report represent a conscientious, thorough, and good-faith effort to review the NSF GS portfolio and make recommendations for portfolio evolution and renewal.
The GS program recommended by the ICCGS is discussed in Chapter 5. Both the assessment committee and the PRC understand that ICCGS recommendations will have both positive and, in some cases, negative effects on the geospace sciences community. The community must therefore understand the basis for the specific recommendations made by the ICCGS and have confidence in the process that led to them. It falls to GS to reach out to the community to explain the recommended program and the basis for the specific recommendations, and to express its confidence in the program as it moves forward with implementation.
Recommendation: The National Science Foundation Geospace Section should reach out to the geospace sciences community to explain the program recommended by Investments in Critical Capabilities for Geospace Science 2016 to 2025 and its basis and keep the community informed regarding plans to implement the recommended program.