Uses and Users
The emergence of the portfolio of R&D expenditure surveys was based on a perceived need for the information. The R&D policies of various administrations and congressional legislation often informed that perception. The result is that the portfolios of recurring and special surveys of R&D expenditures at the start of the 21st century are somewhat eclectic. The surveys range from the flagship federal funds, industrial R&D, and academic spending surveys, which cover the major sectors of R&D activity, to the more narrowly focused survey of federal support for universities, colleges, and nonprofit institutions and the survey of scientific and engineering research facilities at colleges, to the infrequent surveys of innovation, nonprofit institutions, and state R&D spending. Each began with the intent of filling a void in the understanding of the nature and extent of R&D in the United States. Along the way, each has attracted a cadre of users and panoply of uses that extend the utility of the data beyond the original purpose. The users, in the final analysis, determine the relevance of the NSF data offerings.
It is important for NSF to fully understand and respond to the requirements of data users and, to maintain its relevance, to be more anticipatory and proactive in meeting those requirements. As the Committee on National Statistics of the National Research Council (NRC) has advised, a statistical agency’s mission should include responsibility for assessing the needs for information of its constituents in order to ensure that the data and information they provide continue to be relevant over time (National Research Council, 2001b).
AN EXPLOSION OF USES
The catalogue of uses and directory of users of the research and development expenditure data of the NSF Science Resources Statistics (SRS) Division has grown by leaps and bounds over its history. Today, SRS data are widely relied on to perform their historical role in measuring federal R&D funding and R&D performance, but increasingly they are asked to serve purposes never envisioned when the data series were initiated. This growth in the community of users and in the variety of uses has outstripped the capacity of the SRS program to provide all the data needed by all users in a fast-changing world. The limitations of the data have prevailed despite the fact that, over the years, the measures have expanded to sharpen the focus on R&D funding in colleges and universities and in the service sectors of the economy.
In order to be more responsive, NSF has attempted to expand the data in depth and detail, but the agency has been only partially successful. For example, the industrial R&D sample was increased to better capture R&D performance in small and nonmanufacturing firms. The addition of state breakdowns is another example. These refinement and catch-up efforts are generally considered to have fallen short of keeping pace with the expansion of requirements for R&D data in the recent past.
At their most basic level, and to a large extent, R&D expenditure data are important in and of themselves. The time series and periodic special studies are critical to the understanding of trends by source of funding, performer of research and development activity, type of R&D (usually following a taxonomy of basic research, applied research, and development), field of science or engineering, and geographic area (Jaffe, 1996). However, they are incomplete. They measure input, rather than output, of R&D.
Of growing importance, however, are ancillary purposes to which the SRS data have been enlisted. For example, R&D expenditure data today are a key factor in understanding significant processes for which they were not initially designed, such as innovation. Innovation has been defined as the invention, commercialization, and diffusion of new products, processes, and services; these, in turn, are taken to be an important determinant of economic growth, productivity, and welfare (National Research Council, 2001a). The recent NRC report Using Human Resource Data to Track Innovation (2001a) points out that R&D expenditure data are often taken as the best surrogate indicator of innovation, in part because of the high degree of industry and firm detail and wide industry coverage and in part because they are the most consistently collected data with annual time series that extend back for decades. Thus, they are taken to represent the best time series related to innovation. Other series, such as counts of patents, counts of contractual collaborations, counts of inventions, and counts of literature citations, are helpful but have problems in duration, coverage, and consistency that reduce their value for understanding innovation trends.
MEETING THE MULTIPLE NEEDS OF MULTIPLE DATA USERS
The long and growing list of uses and users indicates the difficulty faced by SRS in meeting its obligation as a provider of statistical information. This obligation is best depicted in the NRC publication Principles and Practices for a Federal Statistical Agency (2001b). In brief, the “best practice” guidance suggests that NSF plan its program and identify emerging issues of import by working “closely with policy analysts in its department, other appropriate agencies in the executive branch, relevant committees and staff of the Congress, and appropriate nongovernmental groups” (p. 3).
For SRS, this is no simple task. The list of constituents that SRS identifies in its publications includes policy makers in the Executive Office of the President, particularly in the Office of Management and Budget and in the Office of Science and Technology Policy. It includes a number of congressional committees and staff, highlighted by the Senate Committee on Commerce, Science, and Transportation; the House Committee on Science; the Joint Economic Committee; and the Congressional Research Service. The National Science Board and officials in other federal agencies utilize the data in policy formulation, while the statistical arms of the departments of Commerce and Labor use the data as input to programs of economic measurement. The policy formulation users include policy makers at the state and local levels who play a role in education and technology-based economic development. They include those who seek to inform policy, such as the National Academy of Sciences and its subordinate organizations, including the Board on Science, Technology, and Economic Policy; the Board on Higher Education and the Scientific Workforce; and the Committee on National Statistics. Various professional associations and think tanks, most prominently the American Association for the Advancement of Science, have added to the compendium of requirements.
A growing group of hands-on SRS data users includes academic administrators in the nation’s colleges and universities and planners and policy makers in industry who have much at stake in federal policies and funding programs. Academic researchers seeking to understand scientific processes and explore relevant science and technology policy issues are critically important data users. The media are also a constituency, as they serve as one means of disseminating data and issue-oriented analysis to the SRS audience. Students and faculty are SRS customers; the more informed their career and mentoring decisions, the more effective the science and engineering enterprise. And finally, international organizations such as the United
Nations and the Organisation for Economic Co-operation and Development heavily utilize the SRS data for international comparative studies.
USER REQUIREMENTS AND PRIORITIES
It is not possible, in this study, to fully explore the dimensions of the requirements imposed by this impressive variety of uses and users, nor is it possible to rank the uses and users. Nonetheless, it is important to capture some of the most crucial requirements for the consideration of the panel and to suggest some that should be addressed as important priorities.
This survey of uses and users does not need to begin at the beginning. Previous studies have concluded that SRS has served several of its multiple user-constituents reasonably well. Representatives of the user community who participated in focus groups and structured interviews for the study Measuring the Science and Engineering Enterprise generally believed that SRS was already doing an excellent job (National Research Council, 2000). One individual remarked that, overall, SRS data are the “gold standard” for data related to science and technology, noting that coverage of issues is very good. Another interviewee said that SRS data are very helpful for U.S. science and technology issues: “Their domestic data are essential; they are the official statistics. Their international publications are also useful. The data in such publications as Science and Engineering Indicators, National Patterns of R&D Resources, and Federal Funds are quite detailed and very useful.” Others were not so complimentary, pointing out gaps in the data, problems of timeliness that rendered the data less useful, and concerns stemming from the discrepancy between funding reported by federal providers and by R&D performers.
National Science Foundation
A strategic overview should start at home and focus on two questions: What are the requirements of the National Science Foundation and the National Science Board? How well does SRS meet those needs?
The raison d’etre for the SRS data is to serve the mission of the National Science Foundation. The NSF mission is “to promote the progress of science; to advance the national health, prosperity, and welfare; and to secure the national defense.” More specifically, the statistical purpose of NSF, as clarified over the years, is to provide a central clearinghouse for the collection, interpretation, and analysis of data on scientific and engineering resources and to provide a source of information for policy formation by other agencies of the federal government. This charter unambiguously anchors the SRS data in the mission realm of NSF (albeit a wide-ranging realm), while directing the attention of the statisticians to meeting the needs of other federal agencies. As the programmatic interests of NSF broadened with the passage of time, so did the interpretation of the scope and function of the SRS R&D expenditure surveys. Today, the focus is broadly understood and accepted to be on policy makers, managers, educators, and researchers in the science and technology arena. Nonetheless, it is useful to go back to the origins.
The expanding scope and intent of NSF interests have largely been responsible for defining today’s portfolio of SRS R&D expenditure surveys. The initial data collection was to answer an important but still-difficult question: What is the federal government investing in R&D, and how is it being performed? The first data collections in the early 1950s and the Survey of Federal Funds for Research and Development, which collects data on obligations made by federal agencies, were designed to respond to that narrow inquiry. The question was soon enlarged to permit evaluation of the federal role in view of the total investment in R&D, and the industrial R&D survey was born to answer that question. The data collection net was cast wider over the next two decades, still focusing largely on measuring the federal portion of a total national investment in R&D. Surveys of R&D performance in broadly defined nonprofit institutions and, later, more specifically at major universities followed. Subsequent additions to the portfolio have deepened and otherwise enhanced the R&D data, keeping in sight the seminal need to better understand the role of the federal government in supporting research and development. A tension arises in the SRS division as it seeks to fulfill these NSF-related objectives for the R&D expenditure data while remaining responsive to the larger community of users on issues of data scope, coverage, and quality.
National Science Board
Within the NSF family, the National Science Board (NSB) is another heavy user of SRS R&D expenditure data. The range of NSB’s interests, for example, can be gleaned from a content examination of its biennial report, Science and Engineering Indicators, for which the SRS staff has primary production responsibility. The intent of this volume is to “provide a broad base of quantitative information about U.S. science, engineering, and technology for use by public and private policymakers” and to include, “because of the spread of scientific and technological capabilities around the world … a significant amount of material about these international capabilities” (National Science Foundation, 2004:iii). The content reflects the wide-ranging interests of the board.
The 2002 volume of NSB’s Science and Engineering Indicators (hereafter referred to as Indicators) includes material on science, mathematics, and engineering education from the elementary level through graduate school and beyond; the scientific and engineering workforce; U.S. and international performers, activities, and outcomes; U.S. competitiveness in high technology; public attitudes and understanding of science and engineering; and the significance of information technologies for science and for the daily lives of U.S. citizens in schools, the workplace, and the community.
Many of the SRS R&D expenditure data series are considered important by the NSB and find their way into Indicators. The 2002 Indicators report included data from each of the R&D surveys, as well as several outside sources.
In testimony before the panel, the vice chair of the National Science Board, Anita K. Jones, strongly emphasized the role of the R&D data and the Indicators publication in national science and technology policy making. She focused on the needs of two groups of policy makers: political appointees in the R&D arena at the federal and state levels and scientists and engineers on advisory boards. These boards are of several types: the presidential boards, such as the National Science Board and the President’s Council of Advisors on Science and Technology; the NSF directorate advisory committees; agency advisory committees; and National Academies committees. She pointed out that Indicators is a leading resource for R&D policy makers, in that the data are sound, the definitions and categories are stable, and the longitudinal data provide a useful perspective. In particular, she took issue with a recommendation in a previous National Research Council report, Measuring the Science and Engineering Enterprise, to make Indicators “smaller … and less duplicative of other SRS publications.” Her testimony urged that a smaller Indicators would not be in the best interest of the audience, because Indicators is the “one-stop shop” for policy makers who do not study the multitude of R&D statistics publications (Jones, 2003).
Office of Management and Budget
SRS data are widely used by officials in the U.S. Office of Management and Budget (OMB). In turn, OMB publishes data on R&D that are not directly comparable to the NSF data. OMB data are based on federal budget authority by functional category, while NSF data report federal obligations. OMB data also include budget authority for R&D plant, a category that is not included in the NSF data.1 The primary coin of the realm for OMB is “budget authority,” in that OMB manages and reports the federal budget by budget authority classifications. It is important to understand the distinction among the various measures of R&D activity: budget authority, obligations, outlays, and expenditures (see Box 2-1).

Box 2-1

Budget Authority. The authority provided by federal law to incur financial obligations that will result in outlays.

Obligations. The amounts for orders placed, contracts awarded, services received, and similar transactions during a given period, regardless of when funds were appropriated or payment required.

Outlays. The amounts for checks issued and cash payments made during a given period, regardless of when funds were appropriated or obligated.
In managing the budget of the federal government, OMB has historically divided R&D budget authority into three categories: basic research, applied research, and development. These classifications are coordinated with the National Science Foundation and are developed with a view toward compatibility with international standards and definitions (see Box 2-2).
Box 2-2

Research and Development Activities. Creative work undertaken on a systematic basis in order to increase the stock of knowledge, including the knowledge of man, culture, and society, and the use of this stock of knowledge to devise new applications.
Basic Research. Systematic study directed toward fuller knowledge or understanding of the fundamental aspects of phenomena and of observable facts without specific applications toward processes or products in mind.
Applied Research. Systematic study to gain knowledge or understanding necessary to determine the means by which a recognized and specific need may be met.
Development. Systematic application of knowledge or understanding, directed toward the production of useful materials, devices, and systems or methods, including design, development, and improvement of prototypes and new processes to meet specific requirements.
More recently, additional classifications of the allocation of research funds have been added: congressional direction; inherently unique; merit reviewed with limited competitive selection; merit reviewed with competitive selection and internal evaluation; and merit reviewed with competitive selection and external (peer) evaluation (U.S. Office of Management and Budget, 2003c). OMB also collects and publishes budget authority data on cross-cutting issues identified by the National Science and Technology Council; the U.S. global change research program; networking and information technology; and the national nanotechnology initiative.
The R&D budget authority data published by OMB in the president’s budget request are closely monitored by the R&D community. The regular analysis of these data in the president’s budget request by the American Association for the Advancement of Science is documented below.
Several issues recur in assessing the validity and reliability of the OMB budget authority data. The primary issue is the difference between budget authority and expenditures: budget authority is not always translated into spending. For many reasons, programs develop on schedules that deviate from those envisioned in the president’s budget, so authority may not translate into spending in the anticipated year. Similarly, for many capital programs, budget authority may span several fiscal years, so the relationship between authority and spending in any single year may deviate considerably. There is also some evidence that, despite common definitions, federal agencies classify similar activities differently, and these differing classifications have the greatest influence on the division between basic and applied research. Finally, it is difficult to compare levels of effort from year to year: programs move through the sequence from basic research, to applied research, to development, and then to implementation over the course of several years. As major programs like the space station move through this cycle, the data may change dramatically.
Office of Science and Technology Policy and President’s Council of Advisors on Science and Technology
Use of the NSF data by the Office of Science and Technology Policy (OSTP) varies with the evolving emphasis of the agency. In the recent past, direct utilization of NSF data has been limited, but through the National Science and Technology Council, for which it has day-to-day operating responsibility, and the President’s Council of Advisors on Science and Technology (PCAST), OSTP turns out to be an extensive user of NSF data. A recent example of the use of the SRS expenditure data is the PCAST report Assessing the U.S. R&D Investment (President’s Council of Advisors on Science and Technology, 2002).
The report relies on both budget authority and expenditure data to spotlight trends in R&D investment by the public and private sectors and the shift of emphasis from government to industry investment, which, in turn, shifts the focus from research to development investment.
Congress
Two authorizing congressional committees make extensive use of SRS R&D data in their work. The Senate Committee on Commerce, Science, and Transportation and the House of Representatives Committee on Science have jurisdiction over nondefense federal scientific research and development programs, and specifically over the programs of the National Science Foundation. SRS expenditure data are often used in committee reports and in testimony before these committees. For example, the House Science Committee’s 1998 report, Unlocking Our Future: Toward a New National Science Policy, directly referred to a number of data series from the R&D surveys and extensively quoted expert testimony that, in turn, drew heavily on the R&D data (U.S. Congress, House, 1998). In testimony before our panel, David Goldston, chief of staff of the House Committee on Science, supported a robust program of R&D data, explaining that the current data are heavily relied on to support congressional decision making.
The Joint Economic Committee has utilized NSF R&D expenditure data in several studies over the past several years that have examined the role of R&D and investment in economic growth and prosperity. Two major studies in 1999, The Growing Importance of Industrial R&D to the U.S. Economy and a report on a national high-technology summit, American Leadership in the Innovation Economy, built their analysis on extensive use of SRS data (U.S. Congress, Joint Economic Committee, 1999a, 1999b). More recently, SRS data on R&D in information technology were utilized as the basis for a report, Information Technology in the New Economy. The data types used in these reports include federal and nonfederal R&D as a proportion of gross domestic product (GDP), R&D by major industry group, and international comparisons of R&D expenditures.
The Congressional Research Service (CRS) often relies on the R&D survey data to answer congressional requests. For example, at the request of the Senate Commerce, Science, and Transportation Committee, the CRS conducted a workshop that addressed the collection and reporting of federal R&D funding data to NSF for its various R&D funding surveys. The resulting report, Challenges in Collecting and Reporting Federal Research and Development Data (Congressional Research Service, 2000), remains one of the principal sources of independent assessment of the scope, content, and quality of the SRS data. The report summarizes the disparities that occur in the reporting of federal R&D data by NSF and examines central issues regarding the collection and reporting of the data. It points out several systemic problems with the collection of R&D data from federal agencies, concluding that reporting R&D data is a burden with little benefit to the reporting agency (Congressional Research Service, 2000). The report also suggests that the fields-of-science categories have become less representative over time because of the changing nature of research. The age-old tension between maintaining currency and maintaining a historical time series comes into play for SRS when confronting this and other issues involved in modernizing the data series.
The CRS use of the federal funds survey was discussed in a November 1998 NSF agency workshop on federal R&D. While expressing general satisfaction with the data, the CRS representative requested a breakdown of federal R&D funding by congressional district. He also urged presentation of data by budget authority rather than outlays and obligations, and in a manner compatible with the organization of the congressional appropriation process, that is, with data disaggregated by the nine congressional appropriation categories. Without such breakouts, R&D funding is forced to compete with other budget priorities rather than to be considered as a separate class of spending across the appropriation categories. Finally, he suggested that the data would be more useful if they were more timely (Quantum Research Corporation, 1999:4-5).
The U.S. General Accounting Office (GAO) uses SRS R&D data to respond to requests for information from members and committees of Congress. In May 2001, GAO studied the reported gap between NSF data reported on obligations of federal agencies for R&D support and the amount that performers (including industries, universities, and other nonprofit organizations) reported spending (U.S. General Accounting Office, 2001). Congress was concerned about this gap, which stemmed, in the view of GAO, from comparing two dissimilar types of data, not from poor-quality data, nor from a systemic problem in receiving and spending federal funds.
University administrators are also perennial users of SRS data. Administrators use SRS data on university R&D and academic facilities, primarily as a basis of comparison of their own programs with those of their peers and competitors. These uses are particularly important in public colleges and universities, where state legislative oversight is often aided by these comparative data. In SRS surveys for which university administrators are respondents, SRS obtains very high response rates—around 95 percent—because these administrators, in turn, use the data.
SRS data show up prominently in reports prepared by various other organizations. The American Association for the Advancement of Science primarily uses data from OMB and from federal agencies in its annual report on federal research and development spending in the president’s budget; NSF data are used to provide historical context for these more current data. In a report on the “new economy” prepared by the Progressive Policy Institute, five of 39 indicators drew on SRS data from either Science and Engineering Indicators or National Patterns of R&D Resources, compared with three drawn from the Economic Report of the President and seven drawn from Bureau of Labor Statistics data (Atkinson and Court, 1998). Likewise, the Committee for Economic Development recently released a report, America’s Basic Research, in which almost 90 percent of the data in the report’s tables and figures were SRS data drawn from either SRS publications or NSB’s Science and Engineering Indicators (Committee for Economic Development, 1998).
Industrial R&D Research Analysts
Over the past decade, the industrial R&D research community has focused increasingly on understanding how firms organize to perform R&D, with the ultimate goal of identifying types of operations that achieve higher productivity. This may be considered a micro-level version of the work being conducted by the Bureau of Labor Statistics (BLS). For these purposes, data with considerable detail are needed—information on connections of firms with other firms, with academia, and with government laboratories. The structure of the R&D workforce within an organization is also of interest, with a focus on the degree levels and degree fields employed in the R&D venture. The SRS staff has responded by increasing the amount of detail, while remaining mindful of the need to balance added detail against added respondent burden. Beginning with data for 2001, the NSF industry R&D survey will collect detail on extramural R&D (R&D performed outside the company but inside the United States) by type of contract organization: for-profit, university, and nonprofit. The academic R&D expenditure survey initiated collection of pass-through R&D funding, that is, total dollar amounts sent to (starting in 1996) and received from (starting in 2000) higher education and all other institutions. The one-time information technology innovation survey, in progress this year, delves into these issues in considerable depth.
Consistent with this demand for more internal detail on organizations, academic R&D researchers have urged SRS to collect data at lower levels of aggregation—plants, not firms—and by line of business rather than SIC major industry grouping. This requirement for finer-grained data has significant implications for the sample size and the sample frame selected for the survey and, in consequence, for the overall design and cost of the surveys.
Over the years, academic researchers and analysts have made significant contributions to both the understanding of the R&D enterprise and its measurement. In addition to sorting out the role of R&D in productivity measurement, contributions have been notable in conceptualizing and measuring innovation, in understanding the new organizational structures for the conduct of R&D, and in focusing attention on the international aspects of R&D. The academic research community has been instrumental in illuminating avenues of inquiry and in challenging NSF to improve the measures.
Other Potential Users
The panel thinks that there is the potential to expand the demand for SRS data in the private sector. Members of the industrial business community could become SRS data users if weaknesses in data publication are addressed. In particular, business firm representatives to the Industrial Research Institute have indicated that, if the industrial R&D data were reported on a more timely basis and were disaggregated by line of business, they would use the data for benchmarking. Response rates and accuracy might well improve if respondents to the industrial R&D survey think that the data can be of some use to them.
OTHER GOVERNMENT USES
National Income and Product Accounts
A company could build a new plant to produce one of its products, so a tracking of company expenditures would note the new investment, and analysts could relate the new investment to the production of the product. Such investments are captured by the Bureau of Economic Analysis (BEA) in its System of National Accounts. If a company invests in new plants for two distinct products, BEA would prefer that data collection activities track the investments and their effects separately for the two products.
Since the Census Bureau collects data for its economic surveys at the operating establishment level, much of BEA’s need for separate data on separate products is satisfied: both production data and capital expenditures data are available at the five-digit NAICS (four-digit SIC) industry level.
By extension, national income and product account analysts would like R&D investment data at the same level of detail and collected in much the
same manner. Such an approach would enable them, for example, to put the replacement of an old machine with a new one, and the replacement of an old way of using the machine with a new method, on the same basis. No such data are collected. Companies do not report R&D investment data to any government agency, nor do they report R&D expense data for separate industries. There are serious doubts about whether all R&D expenditures can be separated at the five-digit NAICS level of detail, particularly expenditures for basic research, which may benefit several detailed industry groups.
In BEA’s input-output accounts, neither current expenses nor receipts for R&D are identified at the published level of detail. A portion of R&D is identified at the level of detail at which the estimates are prepared (Bureau of Economic Analysis, 1994:42, fn. 17).
BEA has extensively utilized SRS data in developing a “satellite” account for research and development to supplement the existing national accounts. First introduced in 1994 (Bureau of Economic Analysis, 1994), the satellite account provides estimates of expenditures on R&D that are used in conjunction with the national income and product accounts. The satellite account treats R&D expenditures as a form of investment, recognizing the role R&D plays in adding to knowledge and in developing new and improved processes and products that lead to increases in productivity and growth. The satellite account provides estimates of the stock of knowledge capital.
The estimates of R&D expenditures in the satellite account are based on four of the SRS surveys: Federal Funds for Research and Development, Federal Science and Engineering Support to Universities, Colleges, and Nonprofit Institutions, Research and Development Expenditures at Universities and Colleges, and Industrial Research and Development. The accounts also used input from the SRS surveys of state and local and nonprofit institution R&D expenditures. Several significant adjustments to the SRS data were required to make them compatible with the national accounts, including removing expenditures for R&D structures and equipment, converting fiscal years to calendar years and federal obligations to expenditures, and, importantly, substituting judgmental estimates for R&D data that NSF had suppressed to avoid disclosure of confidential data.
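One of those adjustments, converting fiscal-year totals to calendar years, is often done by prorating adjacent fiscal years. A minimal sketch, assuming an October-September federal fiscal year and spending spread evenly across the year (the proration rule and figures below are illustrative, not BEA's actual method):

```python
def fiscal_to_calendar(fy_totals):
    """Prorate federal fiscal-year totals onto calendar years.

    Federal fiscal year y runs October (y-1) through September y.
    Assuming even monthly spending, calendar year y receives 9/12
    of fiscal year y (Jan-Sep) plus 3/12 of fiscal year y+1 (Oct-Dec).
    """
    return {
        y: 0.75 * fy_totals[y] + 0.25 * fy_totals[y + 1]
        for y in fy_totals
        if y + 1 in fy_totals
    }

# Hypothetical obligations in millions of dollars.
print(fiscal_to_calendar({2000: 100.0, 2001: 120.0}))  # {2000: 105.0}
```

The last calendar year in the range cannot be computed until the following fiscal year's total is available, which is one reason such conversions lag the source data.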
A more recent study (Fraumeni and Okubo, 2002) has explored options for going a step further to actually include R&D in the official national accounts. The proposal to capitalize R&D expenditures would raise investment on the product side of the accounts, thus raising GDP, while changing its composition (investment rises and consumption falls while the level and rate of savings increase). In proposing these changes, the authors recognize several key limitations in the underlying SRS R&D expenditure data for this expanded purpose. These recognized limitations lead to the
expectation that, if this important revision to treatment of R&D in the national accounts were introduced, it would greatly increase the visibility of the SRS data series, putting additional pressure on the quality and timeliness of the data.
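The mechanics of the proposed revision can be illustrated with a small worked example: moving business R&D from current expense to investment raises the product side of the accounts, and hence GDP, by the amount reclassified. All figures below are hypothetical, chosen only to show the accounting:

```python
# Hypothetical figures, in billions of dollars.
consumption, investment, government = 7000.0, 1600.0, 1900.0
business_rd = 180.0  # business R&D currently treated as current expense

gdp_expensed = consumption + investment + government

# Capitalizing R&D reclassifies it as investment on the product side,
# raising GDP and shifting its composition toward investment.
gdp_capitalized = consumption + (investment + business_rd) + government

share_before = investment / gdp_expensed
share_after = (investment + business_rd) / gdp_capitalized
print(gdp_capitalized - gdp_expensed)  # GDP rises by exactly business_rd
print(share_after > share_before)      # the investment share increases
```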
Good productivity estimates need good estimates of R&D. Much of what economists know empirically about the relation between R&D and productivity is based on studies that regress company-level productivity data on R&D along with measures of other inputs (labor, tangible capital, etc.) (Griliches, 1980). Such a model is forced to assume that each firm operates in a single industry.
Studies that have used lines of business as individual observations have not, as a rule, recalculated the regression equations after aggregating up to the company level. Those that have used company-level data, but would have used line-of-business data had it been available, typically lack the information needed to adjust their regressions to mitigate the resulting econometric biases. For studies of the effects of R&D on productivity, the general result may be a smaller or less statistically significant estimated effect.
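The company-level approach described above can be sketched as a Cobb-Douglas production function in logs with an R&D stock term added. The simulated data and elasticities below are hypothetical, intended only to show the regression's form:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated firm-level log inputs: labor, tangible capital, R&D stock.
log_l = rng.normal(4.0, 0.5, n)
log_k = rng.normal(5.0, 0.5, n)
log_rd = rng.normal(2.0, 0.5, n)

# Generate log output with "true" elasticities of 0.6, 0.3, and 0.1.
log_q = 0.6 * log_l + 0.3 * log_k + 0.1 * log_rd + rng.normal(0.0, 0.05, n)

# OLS of log output on log inputs; the coefficient on log R&D is the
# estimated (direct) output elasticity of R&D.
X = np.column_stack([np.ones(n), log_l, log_k, log_rd])
beta, *_ = np.linalg.lstsq(X, log_q, rcond=None)
print(beta[3])  # close to the true elasticity of 0.1
```

Because each diversified firm enters as one observation, its industry mix is averaged away, which is precisely the aggregation problem noted above.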
Multifactor Productivity Estimates
The Bureau of Labor Statistics uses SRS R&D expenditure data in developing estimates of productivity in the nonfarm economy. The data are used as input to the BLS annual multifactor productivity report (U.S. Bureau of Labor Statistics, 2002). In the computation of multifactor productivity, the stock of research and development in private nonfarm business is derived by cumulating constant-dollar measures of research and development expenditures and allowing for depreciation. The current-dollar expenditure levels for privately financed R&D are obtained from the NSF industry R&D series, while price deflators and estimates of depreciation are developed by BLS.
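The stock calculation just described is the standard perpetual-inventory method: deflate each year's spending and cumulate with depreciation. A minimal sketch (the depreciation rate and series below are hypothetical, not the BLS figures):

```python
def rd_stock(nominal_rd, deflators, depreciation=0.15, initial_stock=0.0):
    """Cumulate constant-dollar R&D into a knowledge stock:
    S_t = (1 - delta) * S_{t-1} + R_t / P_t."""
    stock, path = initial_stock, []
    for spending, price in zip(nominal_rd, deflators):
        stock = (1.0 - depreciation) * stock + spending / price
        path.append(stock)
    return path

# Hypothetical nominal R&D and price deflators (base year = 1.0).
print(rd_stock([100.0, 110.0, 120.0], [1.00, 1.02, 1.05]))
```

The choice of deflator and depreciation rate matters substantially here, which is why BLS develops those inputs itself rather than taking them from the NSF series.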
The BLS data on multifactor productivity deal only with the direct return to research and development, that is, productivity gains in industries identified by the amount of R&D they do for themselves. The indirect effects of research and development obtained by purchasers further along the chain of production are likely to be significant. In order to flesh out these series, BLS has conducted research on using NSF data to identify the "indirect effects" of R&D by constructing indexes of the R&D intensity of product producers and using BLS capital data and input-output tables to regress the productivity of downstream users of these products on R&D intensity. As reported in a 1989 study, The Impact of Research and Development on Productivity Growth, in order to better identify and break down these indirect effects, BLS would prefer to use data by detailed industry, much like the line-of-business series produced by the Federal Trade Commission, and to determine the precise effect of federal R&D expenditures on private nonfarm productivity (U.S. Bureau of Labor Statistics, 1989).
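The indirect-effects construction described above can be sketched as follows: weight each supplier industry's direct R&D intensity by its share in the purchasing industry's inputs, yielding an index of the R&D embodied in purchased inputs, on which downstream productivity can then be regressed. The input-output matrix and intensities below are hypothetical:

```python
import numpy as np

# Hypothetical direct R&D intensity (R&D / output) for three industries.
direct_intensity = np.array([0.08, 0.02, 0.005])

# Hypothetical input-output coefficients: io[i, j] is the share of
# industry j's purchased inputs that come from industry i.
io = np.array([
    [0.10, 0.20, 0.05],
    [0.05, 0.10, 0.30],
    [0.02, 0.05, 0.10],
])

# R&D intensity embodied in each industry's purchased inputs:
# weight each supplier's direct intensity by its input share.
indirect_intensity = io.T @ direct_intensity
print(indirect_intensity)
```

The finer the industry detail in the R&D and input-output data, the less this weighting smears high-intensity suppliers across unrelated purchasers, which is why detailed line-of-business data would help.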
Occupational Employment Projections
The Bureau of Labor Statistics also makes extensive use of the NSF industry data in its program of occupational employment projections. BLS staff report that they would make more extensive use of the data if the NSF data (1) provided a measure of total technology-oriented resources; (2) were further disaggregated by industry category; and (3) measured output rather than intensity of effort. Despite these shortcomings, BLS use of the NSF data is expected to grow because a competing series derived from the Occupational Employment Survey has been discontinued.
Integration of R&D into the national accounts and into productivity estimates is not the only national-level interest in R&D data. Both tax credit and subsidy policies are important, and the details in each of those areas can depend on differences across industries. Currently, not enough reliable data on industries are available to enable policy makers to tailor tax credits by industry. Subsidies are typically keyed to broad industry categories: the Defense Advanced Research Projects Agency targets defense applications, and grants flow to specific broad areas via appropriations to the National Institutes of Health and the departments of Energy, Transportation, and Homeland Security. Even general-purpose grant programs, such as those of NSF and the National Institute of Standards and Technology's Advanced Technology Program, are organized along fairly specific technology areas.