NSF's Division of Science Resources Studies is a federal statistical agency that exists to serve the information needs of policymakers, managers, educators, and researchers in the science and engineering community. The division's mandate is "to provide a central clearinghouse for the collection, interpretation, and analysis of data on scientific and engineering resources, and to provide a source of information for policy formulation by other agencies of the Federal government" as required by the National Science Foundation Act of 1950, as amended.
To be effective in this role, SRS must ensure the ongoing relevance of the information it provides through its portfolio of data collection and analysis activities. In this chapter we summarize our recommendations for operational changes that will facilitate an ongoing renewal of the concepts that SRS data should measure and the analysis the division provides. We also summarize the issues in science and engineering resources that we recommend SRS better address through its data collection and analysis activities. We have assigned our highest priority to our first two recommendations, which address how SRS may strengthen its dialogue and interactions with both its stakeholders and external researchers. These are followed by recommendations that provide an immediate agenda for that strengthened dialogue and interaction.
Ensuring Relevance and Establishing Priorities
Relevance of data has many dimensions, and each of these should be considered in determining whether a statistical agency is meeting the needs of its constituents (NRC 1997a). For purposes of this study, we have focused on three factors that affect the relevance and analytic value of data: appropriateness of concepts and their measurement; ability to link data sets; and data currency. Here, we examine operational aspects of a federal statistical agency that affect these dimensions of relevance and the quality of the data and information SRS provides to its constituents.
Appropriateness of Concepts and their Measurement
Recommendation 1. To keep its data relevant and maintain data quality and analytic capacity, SRS should adopt a strategy of continuous review and renewal of the concepts it seeks to measure, and revise its survey instruments, survey operations, and data analysis as needed to keep them current. To achieve this, SRS must strengthen the frequency and intensity of its dialogue and interactions with data users, policymakers, and academic researchers and develop internal processes to convert the feedback it receives from these stakeholders into changes in its surveys and analyses. A key element of this strategy is the creation of advisory committees for SRS surveys that would assist SRS in establishing priorities for future change.
To expand the range of surveys that benefit from advisory committees, we strongly recommend the creation of such a committee for the Survey of Industrial Research and Development. We also recommend that the existing Special Emphasis Panel (i.e., advisory committee) for the Doctorate Data Project (DDP) advise SRS on the Survey of Earned Doctorates (SED), the content of all three SRS personnel surveys, and the design of the Scientists and Engineers Statistical Data System (SESTAT). This panel already provides SRS with advice on the SED and the Survey of Doctorate Recipients (SDR), and should also provide advice on the National Survey of College Graduates (NSCG) and the National Survey of Recent College Graduates (NSRCG).
A statistical agency should work with data users to define the concepts that it will measure to meet users' information needs. These concepts and how to measure them should be continuously reviewed and revised as issues change and as analysis reveals alternate measures that better capture information that is useful to constituents. Special attention should be paid to whether new concepts can be quantified in a meaningful way, whether data may be reliably collected on the subject, the level of detail that meets user needs, and the cost-effectiveness of data collection.
To operationalize this process of ongoing review and renewal of data concepts, SRS must increase the frequency and intensity of its dialogue and interactions with data users, policymakers, and academic researchers to capitalize on their insights, expertise, and analytic capabilities. To generate direct interaction of this sort, SRS should establish an advisory committee for each of its surveys. These committees can assist SRS in keeping survey content up-to-date and also in establishing priorities for future change.
SRS already has such committees (also referred to as "special emphasis panels") for many of its surveys. We urge that an advisory committee be instituted for the Survey of Industrial R&D given changes now occurring in industrial R&D. We also urge that the Special Emphasis Panel for the Doctorate Data Project (DDP) be engaged for advice on the Survey of Earned Doctorates and on all three personnel surveys in the SESTAT system. The 1989 NRC Report, Surveying the Nation's Scientists and Engineers, recommended that an advisory committee be established for the SESTAT system to review its design and content for the next decade. We believe that expanding the scope of the DDP panel is the most efficient means for accomplishing this.
The SRS Breakout Group of the Directorate for Social, Behavioral, and Economic Sciences (SBE) Advisory Committee, which provides NSF with advice on SRS surveys and operations, and the advisory committees for individual surveys should work together to review and assist in the implementation of our recommendations and to set priorities among them and other proposals for change. To facilitate this interaction, the SRS Breakout Group of the SBE Advisory Committee should include individuals who serve on the advisory committees for specific surveys. Through these overlapping memberships, the individual survey advisory committees can contribute to ensuring the overall relevance of the SRS portfolio and to establishing priorities for change among surveys.
SRS may also generate dialogue and interaction through other means. These include holding an ongoing series of workshops on emerging issues in the science and engineering enterprise and improving outreach to constituent groups through booths at conferences and similar activities. SRS should also engage in more purposeful dissemination of publications and enhanced customer service as means of promoting interaction with data users. Finally, SRS should continue to field a periodic customer survey.
SRS should convert the feedback it receives from these stakeholders into means for producing the information its constituents seek. SRS may revise survey instruments or add special modules to instruments to collect data for one survey cycle. In the past SRS has had a quick response panel for issues in industrial R&D. Reinstituting such panels would provide another means for collecting information, particularly on issues that may be currently important, but which do not necessarily signify structural change. Finally, SRS may employ or sponsor qualitative research as a complement to periodic surveys in order to obtain information more rapidly or comprehensively on poorly understood issues.
Recommendation 2. SRS should more actively engage outside researchers in the analysis of SRS data on current science and engineering resources issues. This may be accomplished by allowing researchers to work at SRS as visiting fellows and by establishing an external grants program. SRS should also monitor and summarize research using its data.
Federal statistical agencies analyze their own data to provide substantive analysis of specific subject area issues and also to understand better the uses and limitations of the agency's data. SRS should carefully consider how it may best engage in and support research on science and engineering resources in the future. It should continue its in-house analytical activities, such as those in the Integrated Studies Program, and also actively engage outside experts in analytical activities using SRS data.
Because the division's staff is limited in size, it cannot have expertise in the full range of subject areas on which it may be called to provide data and analysis. SRS and its constituents would therefore benefit from a more interactive relationship between SRS and external researchers who focus on science and engineering resources issues. Programs to bring external researchers into SRS through a visiting fellows program or to provide grants to researchers who utilize SRS data would expand the analytical range of the division and promote data use. These programs would be especially valuable if targeted toward data sets, such as SESTAT, that are underutilized relative to their cost of production. They would also increase interaction between SRS and external data users, producing insights that would benefit SRS staff, researchers, and the relevance and quality of the data.
SRS should also monitor and summarize research conducted by others using SRS data, especially data from the NSCG, NSRCG, and the SDR, which are currently underutilized. This task could be assigned to a contractor responsible for research. Other federal statistical agencies provide similar summaries of research based on data of use to social scientists. Such summaries would be an aid to researchers as well as a source of information for SRS in its role of advising policymakers.
Recommendation 3. SRS and the National Science Board (NSB) should develop a long-term plan for Science and Engineering Indicators so that it is smaller, more policy focused, and less duplicative of other SRS publications to free SRS resources for other analytical activities.
NSB and SRS should develop a long-term plan for restructuring Science and Engineering Indicators. Individuals interviewed for this study, as well as science and technology policymakers recently interviewed by SRS following publication of the last Indicators volume, suggest at least three possible futures for Indicators. The first of these, of course, is maintaining the status quo. As currently conceived, Indicators provides a wealth of information on science and engineering resources in the United States and, increasingly, in an international framework. Both the NSB and SRS benefit from this arrangement: it provides the NSB a means for highlighting important science and engineering issues, and it allows SRS to showcase its data in a high-profile report considered by many science and technology policymakers to be an essential reference for quantitative information. The second is for the NSB to reduce the amount of policy analysis in the volume and concentrate on the data presented. Those with this perspective believe that the policy analysis presently in Indicators is not very useful, while the data are. The third is for the NSB to make the document more focused on policy issues and less on data. Individuals holding this point of view suggest that Indicators would have greater impact if it were smaller, less redundant with other SRS publications, and offered policy insights built on important indicators.
We believe that Science and Engineering Indicators should be smaller and more policy-focused. Indicators would have more impact on science and technology policy if it focused on bringing analysis to a small set of indicators on issues driving the future of the science and engineering enterprise. There should be a sharper division between the work of a policymaking body such as the National Science Board and the work of a federal statistical agency such as SRS. Indicators is also redundant with other publications of SRS data, which could instead be referenced in the volume and linked via hypertext when it is published on the Internet. The substantial SRS resources, especially staff resources, now devoted to the production of this volume would be freed for other analytic activities if the report were refocused.
Data Comparability and Linkages
Recommendation 4. SRS should increase the analytic value of its data by improving comparability and linkages among its data sets, and between its data and data from other sources. Standardizing its science and engineering field taxonomies and other questions across survey instruments is a critical step in this process. Resolving discrepancies in results from different surveys is another.
The ability to link data collected through various instruments increases the breadth and depth of data and, thereby, the ability of analysts to use them to address current issues. SRS's portfolio of data collection activities has been established over the past half century as a number of individual surveys that provide information on specific pieces of the science and engineering enterprise. SRS has recently begun to manage its surveys as components of a more integrated data system. For example, SRS created SESTAT in the early 1990s in response to the 1989 NRC report, Surveying the Nation's Scientists and Engineers, which called for a more integrated science and engineering personnel data system. SRS has also created WebCASPAR, an integrated data source on higher education institutions.
To increase the analytic power of its data, SRS should find ways to integrate its data sets beyond SESTAT and WebCASPAR. First, SRS should continue to improve comparability in questions and response categories across surveys. Its surveys often ask similar questions, yet in different ways. This is clearly evident with regard to questions on field of science and engineering. SRS should develop a standardized science and engineering field taxonomy for its surveys, apply it to each survey instrument, and then keep it up-to-date on an ongoing basis. Second, SRS should continue to investigate discrepancies in survey results among its R&D funding and performance surveys and implement changes in survey instruments and operations to address and resolve them. The division should also consider establishing a committee to develop a design for a new, integrated R&D data system. Such a committee might be charged with devising a system analogous to the integrated SESTAT personnel data system that would also account for the apparent increase in the number of intra- and inter-sectoral R&D partnerships. Third, SRS must develop means for linking its R&D investment data and its human resources data.
SRS must, of course, follow appropriate guidelines for maintaining confidentiality of records as it engages in or facilitates the process of linking data sets internally and externally. However, within the bounds of legal requirements, SRS should make every effort to make its data sets available to researchers, including those who seek to link them to other data. SRS may look to other federal agencies, such as the Census Bureau's Center for Economic Research, as a model for achieving this.
SRS should also help coordinate the data gathering activities of others to improve data availability and comparability with its own data. For example, SRS should encourage standardization in data collection by professional associations and universities on the early job market and career experiences of new Ph.D.s. SRS should also continue to play a leading role in collecting, coordinating, and standardizing international science and engineering resources data.
Finally, SRS should seek effective ways to allow researchers to link its data to those from other data sources, public and private. For example, SRS could seek ways to link its graduate education data with data collected by the Educational Testing Service (e.g., GRE scores). SRS could also explore the ability to link its personnel data to other federal data on fellowships, traineeships, and research awards or to public and private data on patents and publications. SRS should also add an indicator for metropolitan statistical area to each of its R&D funding and performance surveys. This would allow aggregations of data for each survey by metropolitan area and would open these surveys to linkages with other economic, education, and demographic data collected at the metropolitan level.
Recommendation 5. SRS must substantially reduce the period of time between the reference date and data release date for each of its surveys to improve the relevance and usefulness of its data.
The currency of data depends on the periodicity of each survey and the timeliness with which data are released. Data that are collected biennially, for example, are expected to have a "shelf life" roughly twice that of annually collected data, and their relevance depends on their ability to remain current during that shelf life. This currency, in turn, depends on the timeliness with which collected data are released for public use. Timeliness is measured, in the case of SRS, by the time that elapses between the reference date of each survey and the date at which survey data are released. The timeliness and currency of SRS data have been discussed by data users for some time. To improve the currency of its data, SRS must continue its recent efforts to substantially reduce the period of time between the reference date and data release date for each of its surveys. Means for accomplishing this goal include offering incentives for timely response, increasing use of the Internet for data collection, and releasing key indicators early.
SRS as a Statistical Agency
Recommendation 6. SRS should be seen as a federal statistical agency and should be supported in its efforts to meet fully those standards set for federal statistical agencies for independence, professional staffing, data quality, and data analysis.
Recommendation 7. SRS's budget is substantially smaller than those of other federal statistical agencies and may need to be increased given the growing importance of its subject area and our recommendations for new processes, data collection activities, and additional studies. Any budgetary increase must be based on clearer information from SRS on its allocation of internal staff and financial resources across its surveys and other activities and on a clearer sense of priorities among current and future surveys and activities, as developed in coordination with advisory committees for its individual surveys and the SRS Breakout Group of the SBE Advisory Committee.
NSF should see SRS as a federal statistical agency. While SRS is small compared with other federal statistical agencies, its staff of about forty are called on to carry out each of the major functions of a federal statistical agency: data collection and acquisition, quality assurance, preparation of tabulations and public use data files, data analysis, publication of reports, and data and report dissemination.
NSF should support SRS in its efforts to meet fully those standards set for federal statistical agencies regarding independence, professional staffing, data quality, and data analysis. SRS should develop a staffing plan that allows it to improve staff skills and augment staff expertise, especially in key areas. First, SRS should continue to improve statistical and analytical skills among its staff through professional development activities conducted in-house or through the many courses offered in survey and statistical methods in the Washington, D.C. area. Second, SRS should augment its range of staff expertise in relevant subject areas through new hires. The number of full-time equivalent (FTE) positions in SRS was reduced earlier in the 1990s and has remained constant since. We recommend that NSF allow the number of FTEs allocated to SRS to grow so that the division may broaden the range of staff expertise through hires for new positions as well as through staff turnover.
SRS's budget is substantially smaller than those of other federal statistical agencies and may need to be increased given the growing importance of its subject area and our recommendations for new processes, data collection activities, and additional studies. We did not have access to sufficiently detailed budget data to conduct a cost-benefit analysis either of the items we recommend or of existing components of the SRS portfolio. Thus, we have not been able to prioritize all of our recommendations, nor have we been able to suggest trade-offs between new activities and existing ones. Any budgetary increase must be based on clearer information from SRS on its allocation of internal staff and financial resources across its surveys and other activities, and on a clearer sense of priorities among current and future surveys and activities as developed in coordination with advisory committees for its individual surveys and the SBE Advisory Committee's SRS Breakout Group.
Improving Data Relevance
Science and engineering, a $247 billion enterprise in the United States, plays a central role in the advancement of our knowledge-intensive economy and affects the daily lives of Americans in myriad ways. The funding, organization, and conduct of science and engineering continue to evolve even as they contribute to economic and social change. To keep its data relevant for answering today's questions on science and engineering resources, SRS needs to keep its data collection portfolio current with these changes. SRS should investigate under-addressed issues in graduate education, the labor market for scientists and engineers, and research and development funding and performance. SRS should use the results of these investigations to revise its survey instruments and issues for analysis.
Graduate School and the Transition to Employment
Recommendation 8. SRS should revise its data collection on issues in graduate education and the job market for new Ph.D.s to better address issues on financial support for graduate students, completion of graduate school, and the transition of new Ph.D.s to employment. SRS must carefully study whether fielding a new longitudinal survey of beginning graduate students, now under consideration, is feasible and cost-effective before committing to such a survey. However, the division should revise the Survey of Earned Doctorates (SED) to include questions on progress toward degree completion and job market experiences, and it should seek to assist professional societies and universities in the collection of standardized data on the job market for new Ph.D.s.
In the face of a difficult job market for recent science Ph.D.s in some fields in the 1990s, policymakers, educators, and analysts have expressed concerns about the efficacy of certain types of support for graduate students and about the outcomes of graduate education. To better understand these issues, they have expressed a desire for additional data on graduate school completion and attrition, career expectations, educational experiences and skills acquired, packages of student financial support, and the effect of each of these on graduate school and career outcomes.
We recommend that SRS analyze and quickly disseminate the retrospective data it collected through the 1997 SDR on the graduate school and job market experience of Ph.D.s who received their degrees between June 1990 and June 1996. These data should address some of the concerns of policymakers, educators, and analysts.
SRS is currently in the development stage for a new longitudinal survey of beginning graduate students, designed to obtain data on education and job market experiences of graduate students. SRS should carefully consider the feasibility and cost-effectiveness of developing and administering such a survey. Based on our current understanding, we question the wisdom of such a survey. SRS should investigate the current state of research on the graduate school experience, attrition and completion, graduate student financial support, and graduate school outcomes. The division should commission additional studies on these subjects as needed to supplement existing research. SRS should then weigh whether the issues warrant ongoing national data collection from graduate students, examine the ability to collect high-quality national data on attrition and packages of financial support, investigate sampling options, determine the cost-effectiveness of conducting a survey that would gather these data, and consider
alternative sources of data. If fielded, the survey would ideally be longitudinal in nature. However, SRS has not fully exploited the longitudinal nature of its other surveys and resources should be committed to a longitudinal survey only if SRS intends to support and utilize it as such.
We do not recommend that such a survey, if fielded, or the Survey of Earned Doctorates be used to collect data on "skills" obtained by graduate students. We do, however, recommend that SRS revise the Survey of Earned Doctorates to obtain data on progress through graduate school, perhaps by adding a question asking respondents for the date when all Ph.D. requirements except for the dissertation were completed.
Transition to Employment
A serious gap in SRS data on science and engineering Ph.D.s has been the job market experience of Ph.D.s in the twelve months before and after receipt of the degree. SRS also added retrospective questions on this subject to a one-time module in the 1997 SDR. The division should analyze and disseminate these data, but it should also institute ongoing collection by adding questions to the Survey of Earned Doctorates about the job market experience of Ph.D.s prior to degree receipt and about the salaries of new Ph.D.s who have firm employment commitments at the time the degree is received.
The division should also continue to augment its own data collection and dissemination by assisting others collecting data in this area—particularly professional societies that survey their members who are recent Ph.D.s and colleges and universities that track the career outcomes of recent science and engineering alumni. The Association of American Universities recently urged research universities to collect data on degree completion by students and on job placement for alumni. If data collected by these institutions were standardized they could be productively aggregated at the national level.
The Labor Market for Scientists and Engineers
Recommendation 9. SRS should revise its data collection on the labor market for scientists and engineers to better capture the career paths of scientists and engineers. SRS should fill gaps in existing data on careers by collecting comparative data on the careers of humanities doctorates, and data on the nonacademic careers of scientists and engineers, on science and engineering field of work, and on the international mobility of scientists and engineers. The division should also work with the Special Emphasis Panel for the Doctorate Data Project to address content and design issues for the SESTAT system to be implemented in the next decade.
Career Paths of Scientists and Engineers
To improve its data on the labor market for scientists and engineers, SRS should refine or augment several aspects of its personnel surveys to better capture the career paths of scientists and engineers. First, the division should exploit the longitudinal nature of its personnel surveys, whose data were obtained at great expense and with a respondent burden that is difficult to justify if they are not used longitudinally. Second, SRS could provide better career path data by making those data available at a more detailed level by field. SRS should consider its options for allowing finer analysis by degree field, which is currently obstructed by small sample sizes. Increasing sample size is potentially costly, but other options may present themselves. On a related note, the question on science and engineering field of work, dropped from the SDR in 1993 and replaced by a question on occupation, should be restored to the questionnaire. Third, SRS should explore opportunities for linking its personnel data to other career and productivity data, such as data sets of federal research grants, patents, and publications. Fourth, the gap left by the demise of the humanities component of the Survey of Doctorate Recipients seriously undermines our ability to analyze the Ph.D. labor market. SRS must work with the National Endowment for the Humanities (NEH) and, if necessary, other funding sources to reinstate this SDR component.
To better understand the career paths of scientists and engineers and the career options of new Ph.D.s, SRS should revise the SDR to obtain data that better describe the careers of Ph.D.s who work for government agencies, private businesses, and nonprofit organizations. Questions that might be added to better capture nonacademic careers include questions on non-salary compensation; patenting and other productivity measures in the private sector; use of scientific background in sales, regulation, or patent law positions; and temporary work arrangements such as contracting and consulting. This list is illustrative rather than exhaustive.
As the decade comes to a close, we also strongly recommend that SRS work with the Special Emphasis Panel for the Doctorate Data Project to address content and design issues for the SESTAT system for the next decade. The 1989 NRC report, Surveying the Nation's Scientists and Engineers, recommended that an advisory committee review SESTAT at this time.
International Flows of Scientists and Engineers
Given the globalization of the science and engineering labor market, SRS should develop a long-range plan for improving and increasing the data it collects or acquires on the international flows of scientists and engineers. SRS should begin with an effort to improve data on foreign scientists and engineers at all levels in the United States—students, postdoctorates, and employees. SRS should also examine the costs and benefits of including foreign-educated scientists and engineers working in the United States in the sampling frames for the personnel surveys.
R&D Funding and Performance
Recommendation 10. SRS should revise the data it collects on R&D funding, performance, outputs, and outcomes to improve comparability across surveys and to address structural changes in the science and engineering enterprise. SRS should begin by addressing structural changes in industrial research and development, the relationship between R&D and innovation, the apparent increase in intra- and inter-sectoral partnerships and alliances, and claims that interdisciplinary research is increasing. SRS should examine the costs and benefits of administering the Survey of Industrial Research and Development at the line of business level. SRS should also revise its surveys to address new concepts (e.g., the federal science and technology budget), discrepancies in results among R&D surveys, and the need to obtain better data on academic R&D facility costs.
Industrial R&D Statistics
SRS should improve the accuracy of detailed data on industrial R&D. Currently the Survey of Industrial Research and Development (RD-1), which is fielded at the firm level, attempts to disaggregate both applied research and development by asking respondents to distribute these expenditures by product group. Firms, however, often ignore this question, and the low response rate to the product group item has made the collected data of little use. We recommend eliminating the product group question from RD-1. As an alternative strategy for obtaining finer detail on industrial R&D,
SRS should examine the costs and benefits of administering RD-1 to business units instead of firms. Currently all R&D conducted by a firm is attributed to the firm's predominant industrial category. In an economy dominated by large, multi-product firms, line of business reporting, if feasible, may improve data by obtaining finer detail by industrial classification and geographic location.
R&D and Innovation
Current R&D expenditure data do not provide adequate information on many activities contributing to innovation. These activities may include hiring personnel or consultants with new skills, contracting with specialized firms, training existing staff, or reorganizing business processes. SRS should pursue plans to develop a survey of industrial innovation that addresses these and other issues regarding the manner in which science and technology are transferred among firms and transformed into new processes and products. SRS should include both potential respondents and data users in the development of the survey instrument.
SRS should also conduct or sponsor research on the nature of R&D in the service sector. The service sector has increased its share of national R&D investment significantly, from less than 5 percent in the early 1980s to almost 25 percent today. SRS should seek to better understand the processes and outcomes of service sector R&D in order to determine the kinds of changes that should be implemented in the Survey of Industrial Research and Development and other surveys to capture important characteristics of R&D in this sector. As a supplement to this investigation, SRS should also examine how personnel data may be used to examine trends in research utilization and innovation.
Partnerships and Alliances
Recent trends suggest that the organizational structure of research and development now includes a web of partnerships and subcontracts among firms, universities, and federal agencies and labs. Yet the extent of such partnerships and their value is not fully understood, in part because of limitations in SRS data. SRS should investigate the nature and variety of these strategic alliances in R&D, including the role of partnerships, outsourcing, mergers and acquisitions, investments in allied firms, and cross-sectoral consortia in performing and supplying R&D. Results of these investigations should guide SRS in revising its survey questionnaires, as necessary, to obtain more complete detail on the role of these partnerships in the science and engineering enterprise. For example, anecdotal evidence suggests that collecting data on inter-sectoral partnerships through the Survey of Industrial R&D may be difficult. A study might explore whether such data could instead be obtained through the Survey of R&D Expenditures in Universities and Colleges and through the Survey of Doctorate Recipients.
Similarly, it has become almost a cliché to say that the amount of interdisciplinary and multidisciplinary research is increasing and that much cutting-edge research falls into this category. It is difficult to assess, however, how much research is multidisciplinary because of limitations in SRS data. Given the potential import of such a development for how federal funds should be allocated, the division should hold a workshop or commission a study to better understand the nature and extent of this phenomenon. The workshop or study should be designed to provide insight on how to implement changes consistently across its R&D and human resources data collection efforts in order to better capture multidisciplinary R&D when it occurs.
Allocating Federal Resources for Science and Technology
The allocation of federal resources for science and technology has been much discussed in the wake of the Cold War and substantial increases in funding for the biomedical sciences. At the request of Congress, an NRC panel examined processes for allocating federal resources for research and development and suggested that Congress and the Executive Branch focus on funding trends for the "federal science and technology budget" (FS&T), that is, the part of R&D spending that focuses on the "creation of new knowledge or technologies" and excludes the testing and evaluation of new weapons systems. SRS could take steps to better support analysis of this concept by requesting that the Department of Energy (DOE) and the National Aeronautics and Space Administration (NASA) break out the FS&T portion from their aggregate budget and obligation figures, as the Department of Defense (DOD) already does.
SRS should also continue to take steps to investigate and reconcile discrepancies in R&D funding data obtained by its different surveys, since they hamper analyses of federal funding. For example, the Survey of Federal Funds for Research and Development estimates that federal R&D obligations to academic institutions in 1997 were $12.2 billion, while the Survey of R&D Expenditures at Universities and Colleges estimates federally funded R&D expenditures in that year to be $14.1 billion. Similarly, the Federal Funds Survey estimates a decrease of 32 percent in federal obligations to academic institutions for electrical engineering research between 1993 and 1997, while the academic R&D survey estimates for that period an increase in federally funded academic R&D in that field of 27 percent. Some of this discrepancy may be accounted for by the difference between counting research alone and counting research and development together, but not all. SRS needs to resolve these discrepancies to improve the credibility of the data among analysts.
SRS should continue to pursue changes to its Survey of Scientific and Engineering R&D Facilities at Colleges and Universities to provide better data for assessing overhead rates at research universities and estimating future academic infrastructure needs. The collection of improved data on academic facilities to assist OMB in this effort is an important test of SRS's ability to provide data relevant to policymaking.