
5

Measuring the Three K’s: Knowledge Generation, Knowledge Networks, and Knowledge Flows

Knowledge generation can occur formally through directed research and experimental development in academic institutions, firms, and public and nonprofit institutions. Knowledge generation can also occur informally in a working environment through the activities and interactions of actors in an organization or the general economy. People are the critical input for knowledge generation, whether as individual researchers; in research teams; or even in collectives such as organizational subunits, entire organizations, or nation-states.1 Therefore, indicators of knowledge generation focus on attributes of human capital inputs and related outputs. Knowledge can be acquired by using codified (written) sources such as publications or patents, or in tacit form by hiring people with the needed knowledge or participating in networks where the knowledge is stored (Chapter 6 focuses on knowledge embodied in people). Knowledge can be both an intermediate input and a final output and can depreciate over time.2

Knowledge networks link actors, organizations, and technologies in the global economy, revealing new discoveries and transferring know-how on the development of new techniques, processes, and at times breakthroughs that can be commercialized (Chapter 4 focuses on innovation). Knowledge networks include research collaborations, coinventorships, coauthorships, and strategic alliances.3 Knowledge flows travel across knowledge networks and point to comparative advantage, presence in other markets, and access to foreign technologies. To use acquired knowledge, recipients must have absorptive capacity.4

Knowledge generation, diffusion, and use, as well as conduits for knowledge flows, are all key elements for economic growth (Romer, 1990). Therefore, it is critically important for the National Center for Science and Engineering Statistics (NCSES) to produce indicators of these varied dimensions of knowledge at the national, international, and subnational levels.

Quite a few data elements, such as research and development (R&D), patents, bibliometrics, and trade in technology, capture knowledge generation, networks, and flows (referred to as “the three K’s”). NCSES has been collecting these data for several decades in order to publish indicators on these topics, drawing on both its own and other data sources, such as the Bureau of Economic Analysis for data on global multinational R&D activities. International R&D is well covered by NCSES’s Business Research and Development and Innovation Survey (BRDIS). While NCSES has good measures of knowledge creation, a number of complex issues remain unaddressed, and challenges remain in the measurement of knowledge flows.

Therefore, the purpose of this chapter is to discuss the dynamics and outcomes of scientific R&D. To illustrate specific uses of science, technology, and innovation (STI) indicators in this context, the focus is on the policy questions that can be addressed using indicators on the three K’s; however, it should be noted that these indicators have several other uses. Box 5-1 highlights key policy questions relating to the generation and transfer of knowledge.5 While raw data on R&D expenditures and patent citations are useful for understanding whether the United States is falling behind other countries in R&D expenditures and outcomes, more sophisticated statistics are required to address other issues pertaining to the competitiveness of U.S. companies and the benefits of buying and selling R&D internationally. The focus of this chapter is on the latter set of indicators.

____________________

1See Phelps and colleagues (2012, p. 7) for a description of repositories of knowledge. Romer (1990, p. S84) makes the following distinction between knowledge as an intermediate and final output: “… knowledge enters into production in two distinct ways. A new design enables the production of a new good that can be used to produce output. A new design also increases the total stock of knowledge and thereby increases the productivity of human capital in the research sector.”

2See Huang and Diewert (2011) for methods of measuring knowledge depreciation.

3For an extensive definition of knowledge networks, see Phelps et al. (2012, p. 61, endnote 1).

4OECD (2013a) gives definitions of knowledge flows and classifications of indicators of knowledge flows in science, technology, and innovation sectors.

5See Appendix B for the full list of policy questions.


BOX 5-1
Policy Questions Related to Knowledge Generation, Networks, and Flows

  • What new technologies or fields are emerging from current research?
  • Is the United States promoting platforms in information and communication technology, biotechnology, and other technologies to enable innovation in applications?
  • Is the United States falling behind other countries in R&D expenditures and outcomes?
  • How much are U.S. companies spending to be present in emerging markets? How much R&D are they conducting in these nations?
  • Is the United States losing or gaining advantage by buying and selling its R&D abroad?
  • Is the United States benefiting from research conducted in other countries?


A recent OECD (2012c) study titled Knowledge Networks and Markets in the Life Sciences describes key aspects of the three K’s in which indicators require further development. The following findings are particularly in accord with those presented in this chapter:

  • Individuals, firms, and countries are not uniformly linked to knowledge networks.
  • Evidence gaps persist with respect to capturing differences between knowledge production and use (as in the case of R&D), capturing partnerships and their financial dimension, monitoring the combined outward and inward dimensions of knowledge flows, and going beyond intellectual property indicators as measures of knowledge outputs.
  • Measurement standards need to be adapted if improvements are to be achieved in the interoperability of STI data sources across different domains, such as R&D, patents, other forms of registered intellectual property, scientific publications, innovation survey data, and administrative sources. Solutions need to be developed that address the impact of knowledge flows on the interpretation, relevance, and international comparability of existing STI indicators.

NCSES is poised to make important contributions to the improvement of indicators on the three K’s. Collaborative efforts with other agencies in the United States and abroad should be fruitful for this endeavor.

CODIFIED DEFINITIONS

The internationally accepted definition of “research and experimental development”—more commonly referred to as R&D—comes from OECD (2002, p. 30): “creative work undertaken on a systematic basis in order to increase the stock of knowledge, including knowledge of man, culture and society, and the use of this stock of knowledge to devise new applications.”6 In BRDIS, NCSES expands on this definition, providing the following guidance (U.S. Department of Commerce, 2011, p. 3):

R&D is planned, creative work aimed at discovering new knowledge or developing new or significantly improved goods and services. This includes (a) activities aimed at acquiring new knowledge or understanding without specific immediate commercial applications or uses (basic research); (b) activities aimed at solving a specific problem or meeting a specific commercial objective (applied research); and (c) systematic use of research and practical experience to produce new or significantly improved goods, services, or processes (development).

The term “research and development” does NOT include expenditures for:

  • costs for routine product testing, quality control, and technical services unless they are an integral part of an R&D project;
  • market research;
  • efficiency surveys or management studies;
  • literary, artistic, or historical projects, such as films, music, or books and other publications; and
  • prospecting or exploration for natural resources.

The term “science and technology” (S&T) covers a wide range of activities, including R&D, but is rarely defined in the literature, perhaps because its breadth leads to its being used in different ways in different contexts. The United Nations Educational, Scientific and Cultural Organization (UNESCO) (1984, p. 17) provides a definition of the term that is used for this chapter:

For statistical purposes, Scientific and Technological Activities (STA) can be defined as all systematic activities which are closely concerned with the generation, advancement, dissemination, and application of scientific and technical knowledge in all fields of science and technology, that is the natural sciences, engineering and technology, the medical and the agricultural sciences (NS), as well as the social sciences and humanities (SSH).7

____________________

6This is the definition used by all OECD, European Union, African Union, and Latin American countries. All elaborate on this definition in their survey instruments, as the United States has done, to incorporate definitions of basic research, applied research, and experimental development.

7Also included in the definition of S&T are “scientific and technological services” and “scientific and technological education and training,” the definitions of which are found in United Nations Educational, Scientific and Cultural Organization (1978).


Because S&T includes but is not limited to R&D,8 the focus of this chapter is on indicators of foreign direct investment in R&D and trade in knowledge-intensive services. Measurement of intangible assets is also touched upon, although the panel views the development of such measures as no more appropriate for NCSES than for the Bureau of Economic Analysis.

MEASURING SCIENCE AND TECHNOLOGY: MAJOR GAPS IN INTERNATIONAL COMPARABILITY

Comparability is a universal challenge for statistics and for indicators based on those statistics. The comparability of data can be affected by the survey techniques used to collect the data and the conversion of the data into statistics through the use of weighting schemes and aggregation techniques. These problems are amplified when statistics are used to create indicators, as the indicators may be a combination of statistics (e.g., an average, a sum, or a ratio) with different comparability problems. In addition to the international or geographic comparison of indicators that describe an aspect of a system (e.g., R&D as a percentage of gross domestic product [GDP]), there are problems with intertemporal and intersectoral comparisons. Users of indicators need to recognize that all statistics and indicators have a margin of error beyond which they should not be pushed. The problem is growing as response rates to official surveys continue to decline.

International comparisons entail fundamental issues such as language (e.g., the Japanese term for “innovation” is actually closer to what most Americans think of as “technology”), and NCSES is to be congratulated for supporting a project with OECD and the European Union (EU) on the cognitive testing of survey questions in multiple languages. Differences in institutions (e.g., the accounting for the European Union Framework program across EU member states) pose problems, as do cultural differences (e.g., the Nordic world has access to “cradle to grave” linked microdata on individuals) and differences in governance structures (e.g., the importance of subnational R&D programs in some countries). These differences can limit comparability and increase the margin of error that should be applied to international comparisons of statistics and indicators.

In the area of S&T indicators, a number of key comparability problems are well known. OECD compiles S&T statistics, monitors the methodology used to produce them, and publishes international comparisons; it has documented the problems summarized below.

Research and Development9

The R&D data of each country depend on the coverage of its national R&D surveys across sectors and industries. In addition, firms and organizations of different sizes are measured, and national classifications of firm sizes differ. Countries also do not necessarily use the same sampling and estimation methods. Because R&D typically involves a few large organizations in a few industries, R&D surveys use various techniques to maintain up-to-date registers of known performers. Analysts have developed ways to avoid double counting of R&D by performers and by companies that contract with those firms or fund R&D activities of third parties. These techniques are not standardized across nations.
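As a minimal sketch of the performer-based accounting that underlies one common approach, the hypothetical records below report the same contracted project from both the funder's and the performer's side; national totals sum only intramural (performed) R&D, so the project is counted once. The firms and amounts are illustrative assumptions, not survey data.

```python
# Hypothetical R&D survey reports. Each row is one report:
#   reporter     - the unit filing the survey
#   kind         - "intramural" (R&D performed in-house) or
#                  "extramural" (R&D funded but performed elsewhere)
#   amount_musd  - millions of dollars
records = [
    {"reporter": "Firm A",         "kind": "intramural", "amount_musd": 120.0},
    {"reporter": "Firm A",         "kind": "extramural", "amount_musd": 30.0},  # paid to Contract Lab B
    {"reporter": "Contract Lab B", "kind": "intramural", "amount_musd": 30.0},  # the same work, performer side
]

# Performer-based total: sum only intramural (performed) R&D, so the
# contracted project is counted once, at the performer.
performed_total = sum(r["amount_musd"] for r in records if r["kind"] == "intramural")

# Funder-reported extramural amounts serve as a coverage cross-check,
# not as an addition to the performance total.
extramural_check = sum(r["amount_musd"] for r in records if r["kind"] == "extramural")

print(performed_total)   # 150.0 (not 180.0, which would double count the contract)
print(extramural_check)  # 30.0
```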

R&D expenditure data for the United States are somewhat underestimated for a number of reasons:

  • R&D performed in the government sector covers only federal government activities. State and local government establishments are excluded from the national figures.10
  • In the higher education sector, R&D in the humanities is excluded, as are capital expenditures.11
  • R&D expenditures in the private nonprofit sector include only current expenditures.
  • Depreciation is reported in place of gross capital expenditures in the business enterprise sector.

Allocation of R&D by sector poses another challenge to the comparability of data across nations. Using an industry-based definition, the distinction between market and public services is an approximate one. In OECD countries, private education and health services are available to varying degrees, while some transport and postal services remain in the public realm. Allocating R&D by industry presents a challenge as well. Some countries adopt a “principal activity” approach, whereby a firm’s R&D expenditures are assigned to that firm’s principal industrial activity code. Other countries collect information on R&D by “product field,” so the R&D is assigned to the industries of final use, allowing reporting companies to break expenditures down across product fields when more than one applies. Many countries follow a combination of these approaches, as product breakdowns often are not required in short-form surveys.
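The difference between the two allocation approaches can be seen in a small illustrative calculation; the firm and dollar amounts below are hypothetical, and the NAICS codes are used only as labels.

```python
# One hypothetical firm whose principal activity is motor vehicle manufacturing
# (NAICS 3361) but whose R&D spending spans two product fields ($ millions).
firm = {
    "principal_industry": "3361",
    "rd_by_product_field": {"3361": 70.0,   # vehicle-related R&D
                            "3345": 30.0},  # instruments-related R&D
}

# Principal-activity approach: all R&D is assigned to the firm's main industry code.
principal_activity = {firm["principal_industry"]: sum(firm["rd_by_product_field"].values())}

# Product-field approach: R&D is broken down across the industries of final use.
product_field = dict(firm["rd_by_product_field"])

print(principal_activity)  # {'3361': 100.0}
print(product_field)       # {'3361': 70.0, '3345': 30.0}
```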

____________________


8The OECD Frascati Manual (OECD, 2002, p. 19) notes that “R&D (defined similarly by UNESCO and the OECD) is thus to be distinguished from both STET [scientific and technological education and training] and STS [scientific and technological services].” The Frascati definition of R&D includes basic research, applied research, and experimental development, as is clear from NCSES’s presentation of the definition in the BRDIS for use by its respondents.

9This description draws heavily on OECD (2009, 2011) and Main Science and Technology Indicators (MSTI) (OECD, 2012b).

10NCSES reports state R&D figures separately.

11In general, OECD’s reporting of R&D covers R&D both in the natural sciences (including agricultural and medical sciences) and engineering and in the social sciences and humanities. A large number of countries collect data on R&D activities in the business enterprise sector for the natural sciences and engineering only. NCSES does report data on social science R&D.


The Frascati Manual (OECD, 2002) recommends following a main activity approach when classifying statistical units, but recommends subdividing the R&D by units or product fields for firms carrying out significant R&D for several kinds of activities. This applies to all industry groups and, at a minimum, to the R&D industry (International Standard Industrial Classification [ISIC] Rev. 3, Division 73, or North American Industry Classification System [NAICS] 5417 in North America), although not all countries follow this method.

Comparability problems are also caused by the need to preserve the confidentiality of survey respondents (see Chapter 4). National statistical practice will prevent publication of the value of a variable if it is based on too few responses. This not only results in suppression of a particular cell in a table, but also requires additional suppression if there are subtotals that could be used to infer the suppressed information. The result is reduced comparability, which can be overcome only by microdata analysis under controlled conditions.

In principle, R&D institutes serving enterprises are classified according to the industry they serve. When this is not done, the percentage of business enterprise expenditure on R&D (BERD) performed by what is most likely a service industry is overestimated compared with estimates for other countries.

Finally, R&D performers recently have been asked in surveys to break down their R&D activities across sites in different national territories or regions. Estimating R&D intensity by region or other subnational unit presents additional challenges. The existence of multinationals headquartered in a given country that conduct R&D and trade in R&D services worldwide makes it difficult to pinpoint where the R&D is funded and performed and where it has impact. For example, the R&D could be funded by a head office in Rome, performed in a research institute in Israel, and have an impact on consumers of the resulting product in the United States.

Government Budget Appropriations or Outlays for R&D (GBAORD)12

GBAORD data are assembled by national authorities using statistics collected from budgets. This process entails identifying all the budget items involving R&D and measuring or estimating their R&D content. The series generally cover the federal or central government only. GBAORD is a good reflection of government priorities based on socioeconomic objectives. These statistics often are used for cross-country comparisons, particularly to address such questions as: Is the United States falling behind other countries in R&D expenditures and outcomes? While it is not necessarily the case that high government expenditures foreshadow international preeminence in S&T, it is important to understand whether such expenditures indeed lead to better employment, health, and security outcomes.

However, comparability problems arise because some countries do not include in their GBAORD estimates funding for general support of universities (e.g., the United States) or R&D funded as part of military procurement (e.g., Japan, Israel). Moreover, it currently is not possible for all countries to report, on the basis of budget data, which sectors are responsible for performing the R&D funded by government.

Business Enterprise Expenditures on R&D13

BERD statistics convey business R&D expenditures. OECD breaks down business R&D expenditure data into 60 manufacturing and service sectors for OECD countries and selected nonmember economies. The reported data are expressed in national currencies (as well as in purchasing power parity U.S. dollars), at both current and constant prices.
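To illustrate the conversion that puts national BERD figures on a comparable footing, the sketch below deflates a hypothetical current-price series and converts it with a purchasing power parity rate; the deflators and PPP value are placeholders, not published figures.

```python
# Hypothetical BERD series in national currency at current prices (billions).
berd_current = {2008: 40.0, 2009: 41.5, 2010: 43.8}

# Placeholder price deflators (reference year 2008 = 1.00) and a placeholder
# purchasing power parity of 0.85 national currency units per U.S. dollar.
deflator = {2008: 1.00, 2009: 1.02, 2010: 1.05}
ppp_per_usd = 0.85

# Constant-price PPP dollars: deflate to reference-year prices, then convert
# with the PPP rate rather than a market exchange rate.
berd_constant_ppp = {
    year: (value / deflator[year]) / ppp_per_usd
    for year, value in berd_current.items()
}

for year, value in sorted(berd_constant_ppp.items()):
    print(year, round(value, 1))  # e.g., 2008 47.1
```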

When assessing changes in BERD over time, it is necessary to take account of changes in methods and breaks in series, notably in terms of the extension of survey coverage, particularly in the service sector, and the privatization of publicly owned firms. Identifying new and occasional R&D performers is also a challenge, and OECD countries take different approaches to this challenge in their BERD surveys. In addition, not all activities related to foreign affiliates’ R&D are recorded in company transactions. There are intracompany transfers (e.g., intracompany mobility of researchers) with no monetary counterparts that lead to R&D efforts that do not appear in the statistics as R&D spending by foreign affiliates. The increasing internationalization of R&D and other economic activities also makes it difficult to accurately identify inflows of R&D funds to companies and their precise nature (as discussed later in this chapter). For example, there is a growing need to measure international R&D transactions properly and to deal with the problem of nonpriced transfer of R&D within multinational enterprises. All of these issues require expert data manipulation and statistical analysis, thereby presenting challenges to the international comparability of indicators derived from these statistics.

Technology Receipts and Payments14

Technology receipts and payments, including those for R&D services, show a country’s ability to sell technology abroad and its use of foreign technologies, respectively. Further qualitative and quantitative information is needed to analyze a country’s deficit or surplus because a deficit (surplus) on the technology balance does not necessarily indicate the lack (presence) of competitiveness.

____________________

12This section is based on OECD (2011) and OECD (2012b).

13This section is based on OECD (2011).

14This section is based on OECD (2011).


Measurement errors may lead to underestimation or overestimation of technology transfers. Licensing contracts provide payment channels other than technology payments, and payment/receipt flows may be only part of the total price paid and received. Alternatively, national tax and control regulations on technology receipts and payments may bias data on technology flows, notably for international transfers of multinationals. If royalties are less taxable than profits, then they may be preferred to other transfer channels and exceed the value of technology transferred. On the other hand, if limitations are imposed on royalty remittances, then some portion of repatriated profits will represent remuneration of technology transfer.

Summary

Each of the above reasons for international incomparability of some S&T measures goes beyond what NCSES can deal with on its own. An OECD Working Party, the National Experts on Science and Technology Indicators (NESTI), has been in place for 50 years to discuss these issues and support collaboration to resolve them. Nonetheless, there are some areas in which NCSES has opportunities to adjust definitions and improve methodologies to obtain more accurate STI indicators. For example, finer-grained size classes for firms would allow a better understanding of the relationship between firm size and innovation (as discussed in Chapter 4). In addition, improved measures of business enterprise R&D would shed some light on the question of whether the United States is increasingly depending on knowledge generated in other countries. And better measurement of technology receipts and payments would show which countries are net buyers or sellers of knowledge-intensive services. Recommendations for how NCSES could go about improving these measures appear later in this chapter.

TRADITIONAL INDICATORS OF THE THREE K’S

Patent15 data and bibliometrics (data on publication counts and citations) can be used to measure new knowledge, knowledge networks, and knowledge flows.

Patents

Patent administrative records—including citations, claims, technical classifications, families,16 and countries where the patents are effective—contain a wealth of information about invention. They also contain detail on inventors and applicants and on the regulatory and administrative processes of the patenting system.17 Patent information is useful for determining when a new product or process was developed and its linkages to prior inventions and to research that was the foundation for the invention. Observing where patents are registered can also yield clues to how new knowledge is diffused from nation to nation.
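As an illustration of how such administrative records can be turned into simple indicators, the sketch below computes the share of a country's patents with at least one foreign-based co-inventor from hypothetical records; the "any foreign co-inventor" rule is an assumption made for illustration, not a standard OECD definition.

```python
# Hypothetical patent records: application id, applicant country, inventor countries.
patents = [
    {"id": "P1", "applicant_country": "US", "inventor_countries": ["US", "US"]},
    {"id": "P2", "applicant_country": "US", "inventor_countries": ["US", "DE"]},
    {"id": "P3", "applicant_country": "US", "inventor_countries": ["IN"]},
]

# Share of US-applicant patents with at least one foreign-based co-inventor,
# one simple indicator of international collaboration in inventive activity.
us_patents = [p for p in patents if p["applicant_country"] == "US"]
with_foreign = [p for p in us_patents
                if any(c != "US" for c in p["inventor_countries"])]

share = len(with_foreign) / len(us_patents)
print(f"{share:.0%} of US-applicant patents list a foreign-based inventor")  # 67%
```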

Patent data often are used to develop indicators of knowledge generation, flows, and linkages. OECD’s (2008) Compendium of Patent Statistics 2008 gives several examples:

  • Patent-based statistics can be derived that reflect the inventive performance of countries, regions, and firms.
  • The inventors’ addresses can be used to monitor linkages, including the internationalization of and international collaboration in S&T activities.
  • Knowledge networks can be determined by observing cooperation in research and diffusion of technology across industries or countries in specific technological areas.
  • The market strategy of businesses can be inferred from information contained in the patent file.

At the same time, information derived from patent records must be used with caution (OECD, 2006):

  • The value distribution of patents is skewed: many patents have no industrial application (and hence are of little value to society), whereas a few are of substantial value.
  • Many inventions are not patented because they are not patentable, or inventors may protect them using other methods, such as secrecy or lead time.
  • The propensity to patent differs across countries and industries.
  • Differences in patent regulations make it difficult to compare counts across countries.
  • Changes in patent law over the years make it difficult to analyze trends over time.

The panel emphasizes the first point on the above list: patents may be used strategically in some sectors of an economy to deter competition.

____________________

15“Patents are an exclusive right issued by authorised bodies to inventors to make use of and exploit their inventions for a limited period of time (generally 20 years). Patents are granted to firms, individuals or other entities as long as the invention is novel, non-obvious and industrially applicable. The patent holder has the legal authority to exclude others from commercially exploiting the invention (for a limited time period). In return for the ownership rights, the applicant must disclose information relating to the invention for which protection is sought” (Khan and Dernis, 2006, p. 6).

16“A patent family is the same invention disclosed by a common inventor(s) and patented in more than one country” (United States Patent and Trademark Office, http://www.uspto.gov/main/glossary/#p [June 2013]). The European Patent Office has the following definition: “A patent family is a set of either patent applications or publications taken in multiple countries to protect a single invention by a common inventor(s) and then patented in more than one country. A first application is made in one country—the priority—and is then extended to other offices” (http://www.epo.org/searching/essentials/patent-families.html [June 2013]).

17As administrative records, patent applications and grants are a rich microdata source that do not rely on surveys and do not generate the respondent burden associated with traditional statistical surveys.


Andrew Updegrove of Gesmer Updegrove LLP captured this sentiment by saying, “Patents don’t give value; they cause friction” (Updegrove, 2012). Therefore, substantial patent activity is not necessarily an indicator of major leaps in S&T capabilities or innovation. In some instances, patenting could even have a negative impact on knowledge creation and innovation. Thus the value of observed patent activity as an indicator of knowledge generation or innovation should be assessed sector by sector.

In his presentation to the panel in February 2012, Stuart Graham, chief economist at the United States Patent and Trademark Office (USPTO), outlined USPTO’s Economic Data Agenda. In the near term, the agency will improve its databases, particularly the Patent Assignment, Trademark Casefile, and Trademark Assignment datasets. Over time, USPTO is also “considering providing a forum that would facilitate the posting of additional matched datasets, papers and findings” and working with other agencies to create “matched datasets to other economically-relevant information.” For NCSES’s activities on STI indicators, particularly those related to producing better measures of knowledge generation, flows, and networks, continued collaboration with USPTO should be beneficial. NCSES already relies on USPTO data for basic measures of patenting activity. However, linking basic research outputs to patents and trademarks (including the human capital and demographic markers that are indicated on the records) and ultimately to outcomes that have significant societal impacts would be of great benefit to users of NCSES indicators. In addition, these linked files would be helpful to researchers who work with the datasets of USPTO, NCSES, and others to understand relationships and rates of return in the STI system.

The panel makes no explicit recommendation here for NCSES to do more than continue to explore wider use of patent indicators and to engage in international cooperation on the development of indicators based on patent records to address user needs. There is no standard method for calculating indicators from patent data, and as noted earlier, uncritical analysis of these data can lead to incorrect inferences and misleading policy decisions. It is important to improve data quality and analytical techniques in this area, an effort in which NCSES can play an active role in collaboration with other agencies and organizations worldwide. As NCSES continues to disseminate patent data as part of its STI indicators program, it would be valuable to users to have clear cautions regarding the use and misuse of these statistics for decision-making purposes.

Bibliometrics

Publication is a major vehicle for disseminating and validating research results. Bibliometric data on publication counts and citations thus are a valuable source for measuring scientific performance, tracking the development of new technologies and research areas, and mapping linkages among researchers. Publication counts are based on science and engineering (S&E) articles, notes, and reviews published in a set of the world’s most influential scientific and technical journals (Ruegg and Feller, 2003, p. 31).

A number of characteristics can be used for categorization of publications and indicator development. Fields are determined by the classification of each journal. Publications are attributed to countries by the author’s institutional affiliation at the time of publication. Indicators of coauthorship appear to be affected by two factors. The first is language, although this has become less of an issue as English has become the language most commonly used internationally by researchers. The second is geographic location, although the spread of information and communication technologies has undoubtedly lessened its effect on knowledge flows. The quality of publications can be measured both by the quality of the journal and by how often the publication is cited in other publications. Citations can also be used to measure knowledge flows and linkages between different research areas. Coauthorship provides an additional measure of linkages and often is used as an indicator of collaboration patterns.

NCSES currently publishes a number of indicators based on bibliometric data. These include counts of S&E articles, shares of articles with domestic or international coauthors, counts and shares of citations and top-cited articles, and citation rates. These indicators can be used primarily to measure the output of scientific research. For example, counts of articles and citations and shares of world totals show how the United States is faring compared with other countries or regions. These indicators can also be used to measure the extent of collaboration and linkage. An example is the network maps used in the report Knowledge, Networks and Nations: Global Scientific Collaboration in the 21st Century, by the UK Royal Society (The Royal Society, 2011). These network maps are based on authorship of articles and show patterns of collaboration between countries. They are based on numbers of jointly authored research papers, with linkages being displayed when the collaboration between two countries amounts to 5-50 percent of the overall publication output of one of the partners. The OECD (2010) report Measuring Innovation: A New Perspective uses citation data to measure the interrelatedness of different research areas.18
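A sketch of the kind of calculation behind such network maps, using hypothetical publication counts; an edge is kept only when joint papers amount to 5-50 percent of at least one partner's output, mirroring the display rule described above.

```python
# Hypothetical publication totals per country and jointly authored papers per pair.
total_pubs = {"US": 300000, "UK": 90000, "SG": 10000}
joint_pubs = {("US", "UK"): 20000, ("US", "SG"): 3000, ("UK", "SG"): 300}

# Keep an edge when joint output is 5-50 percent of either partner's total output.
edges = []
for (a, b), joint in joint_pubs.items():
    shares = (joint / total_pubs[a], joint / total_pubs[b])
    if any(0.05 <= s <= 0.50 for s in shares):
        edges.append((a, b, round(max(shares), 3)))

print(edges)  # [('US', 'UK', 0.222), ('US', 'SG', 0.3)]
```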

Bibliometric data potentially can be used to create a number of additional indicators to provide further detail on linkages across research areas or by geographic location. This information can be particularly valuable for mapping the development of new research areas, such as green technologies, or the spread of general-purpose technologies.

There are some limitations to the use of bibliometric analysis for the production of S&T indicators, particularly when used to measure causal relationships, such as socioeconomic impacts of funding basic science.

____________________

18This report references the citation technique used in Saka et al. (2010).


It is also difficult to isolate how much research networks have changed because of a given research funding award or the existence of a new collaborative agreement. Impact factors and Hirsch’s h-index, commonly used by bibliometricians, do not allow for comparisons with counterfactual analysis. Furthermore, measures must be normalized to be helpful for comparing research outputs, or they are no better than “nose-prints”—metaphorically, signs of high window-shopping activity, with no true indication that a substantive purchase has occurred. There are ways for numbers of patents and articles to be inflated by their producers without substantive advances in S&T having been achieved. Bornmann and Marx (2013) state that “… mere citation figures have little meaning without normalization for subject category and publication year…. We need new citation impact indicators that normalize for any factors other than quality that influence citation rates and that take into account the skewed distributions of citations across papers.” Bornmann and Marx describe techniques using percentiles to create normalized indicators, an improvement on impact factors and Hirsch’s h-index.19 To its credit, the National Science Board (for which NCSES produces the Science and Engineering Indicators [SEI] biennial volumes) is mentioned by Bornmann and Marx as one of the federal agencies that use percentile ranks of publications. Although this is good practice, it is important to note that these indicators are not appropriate for impact assessment, for which counterfactual evidence is necessary.
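A minimal sketch of the percentile idea, assuming a hypothetical reference set of papers from the same subject category and publication year; exact percentile definitions vary across bibliometric studies.

```python
# Citation counts for a hypothetical reference set: papers in the same
# subject category and publication year as the paper being evaluated.
reference_citations = [0, 1, 1, 2, 3, 5, 8, 12, 20, 55]
paper_citations = 8

# Percentile rank: share of reference-set papers with fewer citations.
# A higher percentile means the paper is more cited than most comparable papers.
below = sum(1 for c in reference_citations if c < paper_citations)
percentile = 100.0 * below / len(reference_citations)

print(f"The paper sits at the {percentile:.0f}th percentile of its reference set")  # 60th
```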

RECOMMENDATION 5-1: The National Center for Science and Engineering Statistics should expand its current set of bibliometric indicators to develop additional measures of knowledge flows and networking patterns. Data on both coauthorship and citations should be exploited to a greater extent than is currently the case.

BUSINESS R&D SERVICES AND INTANGIBLE ASSETS

Although NCSES publishes a rich set of data on R&D expenditures and performance, measures of spillover effects still are needed to aid in determining the effects of scientific investment on socioeconomic outcomes. Policy makers would benefit from such measures in addressing such questions as: What is the effect of federal spending on R&D on innovation and economic health, and over what time frame? What is the international balance of trade in R&D services? How much R&D do U.S. multinational companies conduct outside the United States, and how much R&D do foreign multinational companies carry out in the United States? How much are U.S. companies spending to be present in emerging markets? How much R&D are they conducting in these nations?

This section addresses the question of how R&D data can best be exploited, focusing in particular on the measurement of trade in R&D services. BRDIS contains a rich dataset on R&D that is only partially exploited in present indicators. Given the size and complexity of BRDIS, however, a tradeoff is entailed in terms of the time and resources needed to process these data. BRDIS can be exploited by researchers within and outside government, subject to appropriate restrictions to protect respondents, but only if a researcher database is provided with sufficient metadata20 to define all the variables and the degree of imputation for each.

At the same time, the panel acknowledges that further exploitation of BRDIS would require additional resources and might also involve a trade-off in terms of the timeliness of the release of key R&D indicators. The time required to process and release R&D statistics increased significantly following the introduction of BRDIS, which is a longer and more complex survey than its predecessor, the Survey of Industrial Research and Development. The panel views timeliness as an important factor in determining the value of R&D and other indicators and encourages NCSES to place high priority on reducing the time lag in the release of BRDIS data.

Trade in R&D Services21

One important aspect of R&D is R&D services, which are services for the performance of R&D provided by one organization for another. R&D services are for the most part provided by companies and organizations involved in biotechnology; contract research (including physical, engineering, and life sciences firms); and professional, scientific, and technical areas (including social sciences and humanities). These are companies or organizations categorized under NAICS code 5417 (scientific R&D services). Specifying NAICS codes for R&D services (as does BRDIS) is important, since firms in almost any industry can buy or sell R&D services. For example, Boeing can buy services to fill a gap in its R&D program for wing design; Walmart can sell its knowledge, based on R&D, on supply chains; and extraction firms can buy or sell R&D services related to extraction.

Currently, R&D services are captured through the use of a number of indicators published in the SEI.

____________________

19“The percentile of a publication is its relative position within the reference set—the higher the percentile rank, the more citations it has received compared with publications in the same subject category and publication year” (Bornmann and Marx, 2013, p. 2).

20Metadata describe the data and how they were constructed.

21“Services are the result of a production activity that changes the conditions of the consuming units, or facilitates the exchange of products or financial assets. These types of service may be described as change-effecting services and margin services respectively. Change-effecting services are outputs produced to order and typically consist of changes in the conditions of the consuming units realized by the activities of producers at the demand of the consumers. Change-effecting services are not separate entities over which ownership rights can be established. They cannot be traded separately from their production. By the time their production is completed, they must have been provided to the consumers” (European Commission, 2009, Chapter 6, paragraph 17).


These include R&D by sector and location of performance, funding of R&D by companies and others, R&D performed abroad by U.S.-owned companies, R&D performed in the United States by foreign multinationals (foreign direct investment in R&D), and exports and imports of R&D and testing services. For the SEI, data on R&D performance and funding are taken from BRDIS, while the Bureau of Economic Analysis (BEA) provides the data on foreign direct investment in R&D and on international trade in R&D and testing services.

NCSES is expanding its data-linking activities to match BRDIS microdata with BEA survey microdata on U.S. foreign direct investment. The agency also has undertaken fruitful interagency collaboration with BEA to integrate R&D into the system of national accounts.

The panel deliberated on globalization and its impact on the research enterprise in the United States. An immediate policy question was how much R&D, measured in terms of expenditures, was outsourced to countries such as Brazil, China, or India, and whether R&D was performed by foreign affiliates or purchased from other companies. A related question was how much knowledge produced by U.S. R&D is being purchased by other countries, and which countries are leading purchasers. These are important but also complex questions that present a number of difficult challenges for data collection.

The panel thus commissioned a paper on this subject by Sue Okubo (2012). The paper reviews the current work of BEA in this area and compares it with recent NCSES work on BRDIS.22 Several observations follow from this comparison.

One key observation in Okubo’s paper is the difference between the classifications used by BEA and NCSES and the fact that BEA measures trade in R&D and testing services, whereas NCSES in BRDIS measures R&D services only. While BEA and NCSES are cooperating on survey activity, the panel emphasizes the importance of this cooperation’s leading to comparability of the data produced by these and other agencies (see Recommendation 5-2 later in this section).

The surveys on international transactions administered by BEA and the R&D surveys23 carried out by NCSES follow different guidance: BEA follows the sixth edition of the International Monetary Fund’s (IMF) (2011) Balance of Payments and International Investment Position Manual, while NCSES follows the Frascati Manual (OECD, 2002a). However, the two approaches are not far apart. The IMF manual includes some R&D and intellectual property elements that are consistent with the Frascati Manual. Therefore, the geographic and ownership scope of BEA’s international transaction surveys and that of the BRDIS are conceptually close. For example, BEA’s international transaction surveys encompass any company with activities in the United States, regardless of ownership. The surveys cover transactions of U.S.-located units of foreign multinational enterprises with entities outside the United States, including transactions with their own foreign parents, and affiliated and unaffiliated trade. Similarly, for the United States, the surveys cover affiliated and unaffiliated trade and transactions by purely domestic companies (no relationship with any multinational enterprise). BRDIS also covers any company with activities in the United States, regardless of ownership, and foreign affiliates of U.S. multinational enterprises.

On the other hand, BRDIS treats foreign parent companies differently from the way they are treated in both BEA’s trade surveys and BEA’s surveys of foreign direct investment. Other differences exist between BRDIS and BEA data on the international balance of payments in R&D trade: BEA’s testing services, which are part of the research, development, and testing measure, may include R&D and non-R&D components, and R&D is treated by NCSES basically as a cost measure, while transactions are treated more like market values. Moris (2009, p. 184) suggests a matrix for use in parsing the data from BEA’s trade surveys and R&D surveys (including BRDIS).

A second key observation in Okubo’s paper relates to the results of the BEA surveys with respect to the sale of R&D and testing services abroad. For 2010, the largest buyers of U.S. R&D and testing services were Bermuda,24 Ireland, Japan, the Netherlands, and Switzerland, accounting for 66 percent of the total trade of $30.9 billion. Such a distribution of trade statistics is rare, as is illustrated by trade in professional, business, and technical (PBT) services. In 2010, the largest buyers of U.S. PBT services were Germany, Ireland, Japan, Switzerland, and the United Kingdom, accounting for 37 percent of total trade; the largest sellers of PBT services to the United States—the countries to which these services were outsourced—were Germany, India, Japan, the Netherlands, Switzerland, and the United Kingdom, which accounted for 40 percent of total U.S. payments for these services (Okubo, 2012). The dominance of the leading countries in the sale and purchase of PBT services is seen in other trade figures, but not in the sale and purchase of R&D and testing services. This difference in the concentration of R&D and testing services merits further analysis.


____________________

22In September 2012, NCSES inaugurated a website with two new publications on the International Investment and R&D Data Link project. The site will also house future publications on the BRDIS link (National Science Foundation, 2013b). It should be noted that BEA plans to incorporate R&D as investment in the core economic accounts in 2014.

23The NCSES surveys referred to include BRDIS and its predecessor, the Survey of Industrial Research and Development.

24If one were to start with R&D performers only and then look at their R&D exports and imports, legitimate non-R&D performers that only import their R&D from overseas would be eliminated from the analysis. This exercise would require access to the microdata, which are not publicly available. However, NCSES could conduct this analysis and publish the statistics and rankings. There is no escape from accounting and transfer price issues, such as allocated costs that are not related to actual R&D trade. R&D performance data for multinational enterprises are not immune to this issue. Conditioning on performance for trade flows can eliminate unwanted R&D and training data.


In summary, the questions that beg to be answered are: Under what circumstances does the R&D activity of multinational corporations enhance U.S. economic performance, including leadership and strength in S&T? What effect do tax laws have on the location of R&D services? Clearly, the R&D activity of multinational corporations has grown, but the data available with which to analyze and track this activity have limitations. BRDIS includes data on domestic and foreign activities of firms and can provide a more detailed picture of R&D activities than has previously been possible or been fully exploited. Specifically, BRDIS offers more information on R&D service production and flows of R&D services in the United States and in U.S. firms abroad than has heretofore been published. Understanding outsourcing and trade in R&D services is particularly important because the developed economies are dominated by service industries. BRDIS data also can support measures of payments and receipts for R&D services abroad, by leading countries, which is critically important for policy purposes.

RECOMMENDATION 5-2: The National Center for Science and Engineering Statistics (NCSES) should make greater use of data from its Business Research and Development and Innovation Survey to provide indicators of payments and receipts for research and development services purchased from and sold to other countries. For this purpose, NCSES should continue collaboration with the U.S. Bureau of Economic Analysis on the linked dataset.

The panel believes NCSES can provide these estimates and, if necessary, include appropriate questions on BRDIS in 2013 and subsequent years. The 2008, 2009, and 2010 BRDIS did not allow NCSES to collect all of the elements described above, but the 2011 and 2012 questionnaires are more comprehensive in this dimension, collecting data on R&D production, funding, and transactions. Data would be available with which to produce statistics on payments and receipts for R&D services involving U.S. company affiliates at home and abroad and on how those data differ, if at all, from the BEA measures. Similar information on foreign company affiliates from other sources could be used for parallel comparisons.25 NCSES could consider developing two series—payments and receipts for R&D services—for three to five leading countries. The resulting statistics would show what knowledge creation is being outsourced and which countries are buying U.S. knowledge. This information would enable users to track trends over time and have a better understanding of knowledge flows and the formation of R&D networks.
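One way such series might be organized is sketched below with hypothetical country figures; here "receipts" are sales of R&D services abroad, "payments" are purchases from abroad, and the net balance is receipts minus payments.

```python
# Hypothetical R&D services trade by partner country ($ millions):
# receipts = sales of R&D services abroad, payments = purchases from abroad.
transactions = [
    {"country": "Ireland",     "receipts": 4200.0, "payments": 900.0},
    {"country": "Switzerland", "receipts": 3100.0, "payments": 1500.0},
    {"country": "India",       "receipts": 600.0,  "payments": 2200.0},
]

# Two series (receipts and payments) plus a net balance per leading partner country.
for t in sorted(transactions, key=lambda x: x["receipts"], reverse=True):
    net = t["receipts"] - t["payments"]
    print(f'{t["country"]:<12} receipts={t["receipts"]:8.1f} '
          f'payments={t["payments"]:8.1f} net={net:8.1f}')
```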

Over time, this exercise would provide answers to a range of questions: Is the United States losing or gaining advantage by buying and selling its R&D abroad? Is the United States benefiting from research conducted in other countries? What is the United States learning from other countries, and what are other countries learning from the United States? In what technological areas are other countries accelerating development using knowledge sourced in the United States? What is the role of multinational enterprises in transferring R&D capacity from country to country? The data could also be used in regression analysis to answer another important question: What impact does the international flow of R&D have on U.S. economic performance? Users of the data on international flows of R&D services are likely to be interested in seeing how emerging economies are advancing in R&D capacity, in what fields U.S. companies are sourcing or outsourcing R&D and whether it is increasingly being sourced or outsourced in specific countries, and which countries 5-10 years from now may be the hub of new scientific knowledge—possibly countries in Latin America, the Middle East, or sub-Saharan Africa.

Intangible Assets

Until recently, the important role of knowledge-based capital (KBC) was rarely recognized, one exception being Nakamura’s (1999) research on intangibles26 and the “New Economy.” This situation has changed primarily as a result of the pioneering research of Corrado and colleagues (2005) on intangibles. In their 2006 paper, these authors point out that most knowledge-based investment is excluded from measured GDP and from most productivity and economic growth models. The authors recognize three broad categories of KBC: computerized information (software and databases); innovative property (patents, copyrights, designs, trademarks); and economic competencies (including brand equity, firm-specific human capital, networks joining people and institutions, and organizational know-how that increases enterprise efficiency). Another important form of KBC is human capital that is not firm specific, such as most human capital that is created through education.27 The World Bank (1997) estimates that for most countries, intangibles, including human capital more broadly defined, represent the majority of a country’s wealth.28 By all accounts, failing to recognize KBC in any analysis of economic growth or the potential for innovation is a significant omission.

For this reason, a major development in the measurement of KBC occurred when the status of R&D was changed in the 2008 System of National Accounts (SNA) from an expense to an (intangible) capital investment. Efforts are still ongoing both in the United States (see, e.g., U.S. Bureau of Economic Analysis, 2010) and internationally to integrate R&D fully into national accounts.

____________________

25See, for example, Eurostat 2010 statistics (Eurostat, 2013). Also see statistics for Germany (Deutsche Bank Research, 2011) and on the Indian engineering R&D offshoring market (NASSCOM and Booz & Company, 2010). These two reports cite private company estimates, as well as published Eurostat statistics.

26 Part of the broad category of KBC; see, e.g., OECD (2012a).

27Human capital is discussed in Chapter 6.

28World Bank intangibles include human capital, the country’s infrastructure, social capital, and the returns from net foreign financial assets.


This work requires not only high-quality data on R&D, but also methods for estimating the depreciation of R&D capital, appropriate R&D deflators, and the estimation of price changes. Although the integration of R&D into the SNA is mainly the responsibility of BEA, NCSES has an important role through its long-standing expertise in the collection of R&D data.

The estimates of Corrado, Hulten, and Sichel for the United States give a sense of the relative importance of various components of KBC as defined above (Corrado et al., 2006). Almost 35 percent of their measured KBC either is currently in GDP (computer software) or is in GDP beginning with estimates for 2013 (mainly scientific R&D). Some data on nonscientific R&D (e.g., social science R&D) are now collected through National Science Foundation (NSF) surveys. Total nonscientific R&D is estimated by Corrado, Hulten, and Sichel to be in excess of 20 percent of total R&D. The largest portion of the unmeasured component, economic competencies, accounts for somewhat less than 40 percent of spending on business intangibles.

More than 70 percent of spending on economic competencies is for firm-specific resources. This spending includes employer-provided worker training and management time devoted to increasing firm productivity. Examples given for management time are time for strategic planning, adaptation, and reorganization. Corrado, Hulten, and Sichel used management consulting industry revenues, trends in compensation, and numbers of individuals in executive occupations to estimate spending in the management time category. Sixty percent of advertising expenditures is allocated to business spending on brand equity intangibles.29

A number of researchers have estimated KBC for individual countries following the lead of Corrado, Hulten, and Sichel. These countries include Australia (Barnes, 2010; Barnes and McClure, 2009), Canada (Baldwin et al., 2008), China (Hulten and Hao, 2012), Finland (Jalava et al., 2007), France and Germany (Delbecque and Bounfour, 2011), Japan (Fukao et al., 2007, 2009, 2012; Miyagawa and Hisa, 2012), the Netherlands (van Rooijen-Horsten et al., 2008), and the United Kingdom (Gil and Haskel, 2008; Marrano et al., 2009). Corrado and colleagues (2012) recently completed KBC estimates for the 27 EU countries and the United States. In addition, the methodology for estimating individual components of KBC has been refined, most notably by Gil and Haskel (2008).

A discussion paper by Corrado and colleagues (2012) provides the broadest view of the importance of KBC, as it covers the largest number of countries.30 In their estimates, the United States stands out for two reasons as compared with regional EU country averages: it has the largest share of intangible investment in GDP (11 percent), and it is the only country/region for which intangible investment is a larger share of GDP than tangible investment. In all country/regional comparisons, however, the rate of growth in intangible investment exceeds that in tangible investment. The authors report three main results. First, capital deepening is the dominant source of economic growth once intangibles are recognized. Second, deepening of intangible capital accounts for one-fifth to one-third of the growth of labor productivity. Finally, the contribution of intangible capital in some large European countries (e.g., Germany, Italy, and Spain) is lower than that in the United Kingdom and the United States. However, there are significant country differences in the distribution of intangibles by broad types: computerized information, innovative property, and economic competencies.
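
The second result can be read through a standard growth-accounting decomposition in which intangible capital is treated symmetrically with tangible capital. The expression below is a generic illustration of that framework, not the authors' exact specification:

\[
\Delta \ln\left(\frac{Y}{L}\right) = s_T\,\Delta \ln\left(\frac{K_T}{L}\right) + s_I\,\Delta \ln\left(\frac{K_I}{L}\right) + \Delta \ln A,
\]

where \(Y\) is output, \(L\) is labor input, \(K_T\) and \(K_I\) are tangible and intangible capital, \(s_T\) and \(s_I\) are their factor income shares, and \(\Delta \ln A\) is multifactor productivity growth. The finding that intangible capital deepening accounts for one-fifth to one-third of labor productivity growth means that the second term on the right contributes that fraction of the left-hand side.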

Aizcorbe and colleagues (2009) review various definitions of innovation; propose how measures of innovation such as those addressed by Corrado, Hulten, and Sichel could be integrated into a satellite account; and outline future BEA plans. They note that whether advertising and marketing expenditures should be treated as investment is still being debated, and they question whether cumulating all firms' advertising expenditures should register as an increase in aggregate output. They also comment on the difficulty of measuring spending on organizational change. Like Corrado, Hulten, and Sichel, they recognize that developing deflators and depreciation rates for most intangibles can be difficult. Their paper calls for cultivating data sources on spending on the development and implementation of new business models, the creation of new artistic originals (see below), the design of new products, and intermediate inputs to innovation. Finally, they hope to work toward better price and depreciation estimates and, in cooperation with the Census Bureau and NSF, toward the publication of firm- and establishment-level innovation statistics.

Since Corrado, Hulten, and Sichel published their first paper on intangibles in 2005, U.S. government agencies have moved forward to measure and recognize intangibles more comprehensively. As mentioned above, efforts are under way to capitalize R&D and fully integrate it into the SNA. Investment in artistic originals was incorporated into U.S. GDP in 2013 (Aizcorbe et al., 2009).31 BEA-defined artistic originals include theatrical movies, original songs and recordings, original books, long-lived television programming, and miscellaneous artwork (Soloveichik, 2010a,b,c,d, 2011a,b). For many years, mineral exploration, a relatively small component, has been recognized as investment in U.S. GDP.

Many reports and monographs, and at least one book, have been produced on KBC, most of them since 2005.

____________________

29More information on how business spending on intangibles was estimated is available in Corrado et al. (2005).

30The years covered vary in Corrado et al. (2012): the earliest beginning year is 1995, and the latest is 2009. Regions include Scandinavian (Denmark, Finland, and Sweden), Anglo-Saxon (Ireland and the United Kingdom), Continental (Austria, Belgium, France, Germany, Luxembourg, and the Netherlands), and Mediterranean (Greece, Italy, Portugal, and Spain).

31See Chapter 7 of this report for more detail on how Aizcorbe and colleagues at BEA are using administrative records and web-based data in the agency’s project to capitalize intangible assets for inclusion in the SNA.

An interim project report from OECD (2012a) echoes the Corrado and colleagues (2012) conclusion that intangibles have been estimated to account for a substantial share of labor productivity growth: 20-25 percent across Europe and 27 percent in the United States. In addition, the OECD report notes that there are substantial spillovers from and repeated use of KBC, and that global competitiveness may increasingly be determined by KBC. After offering answers to the question of why business is investing in KBC, the OECD report focuses on policy questions. The policy challenges discussed with respect to KBC are in the areas of taxation, competition, intellectual property rights, personal data, and corporate reporting. Other publications approach KBC more from an accounting or business perspective. Lev (2001) uses movements in stock market prices to estimate the impact and importance of intangibles. An extensive report by Stone and colleagues (2008), written from the business/accounting perspective, includes a lengthy list of references. Among its contributions are a summary of efforts to measure firm- and aggregate-level innovation and a taxonomy of possible types of measures: indicator indices, monetary, and accounting. Many authors recognize the complexity of measuring and estimating the contribution of KBC to economic growth.

The potential definition of KBC is far broader than that employed by Corrado, Hulten, and Sichel. Aside from including all formal education, not just employer-provided training, Stone and colleagues (2008) cite two major additional categories: relational capital and open innovation. Relational capital refers to relationships with external stakeholders, including customers and suppliers. Its value can encompass the complementarity of user needs, such as customers and advertisers using Google for similar purposes. Companies that use open innovation post R&D and commercialization challenges on web-based forums or "marketplaces" that are accessible to communities of scientists, engineers, and entrepreneurs. A component of the Corrado, Hulten, and Sichel definition that is featured less prominently in related research, including that of Stone and colleagues (2008), is general networking. Stone and colleagues comment that general networking is particularly useful for businesses operating in emerging economies. Facebook provides a form of social capital/networking that by extension has information and business value. Each of these expansions or extensions of the Corrado, Hulten, and Sichel definition of intangibles presents substantial measurement challenges.

As stated by Stone and colleagues (2008, p. II-4), "Intangible assets are not innovations, but they may lead to innovations." And as Ben Bernanke stated in the concluding sentence of a 2011 speech, "We will be more likely to promote innovative activity if we are able to measure it more effectively and document its role in economic growth." The open question, however, is which components of KBC lead to economic growth and to what degree; answering it is part of the challenge of making a direct and quantifiable connection between innovative activity and economic growth. Some components of KBC, scientific R&D being the prime example, have been studied extensively to document their role; others, such as organizational know-how, have been studied far less. The importance of KBC as an STI indicator depends on drawing these connections. It is also critical to recognize both KBC and tangible capital as factors that may be important indicators of future growth. Although the panel believes work on intangible assets may generate useful STI indicators, it concludes that NCSES should not seek to produce these statistics on its own, but should support and draw on the work of other agencies, particularly BEA, in this area. NCSES still has an important role to play through its collection of high-quality R&D data, and it may also be able to contribute through other data sources. This might be the case, for example, if NCSES were to begin collecting data on innovation-related expenditures, as outlined in Chapter 4.

RECOMMENDATION 5-3: The National Center for Science and Engineering Statistics (NCSES) should continue to report statistics on knowledge-based capital and intangible assets obtained from other agencies as part of its data repository function. In addition, NCSES should seek to use data from the Business Research and Development and Innovation Survey on research and development and potentially also on innovation-related expenditures as valuable inputs to ongoing work in this area.

Indicators of General-Purpose Technologies

“General-purpose technology” (Lipsey et al., 2005) is a term used to describe technologies with the potential to transform the economy and activities across a broad range of sectors and industries (Jovanovic and Rousseau, 2005). Earlier examples are steam, electricity, and internal combustion, while more recent examples include information and communication technologies, biotechnology, nanotechnology, and green technologies. Given their potential importance for innovation and growth, tracking the development of these technologies and their diffusion and application is important to inform policy. In this area, there is one particular policy question that users of STI indicators are eager to have answered: Is the United States promoting platforms in information and communication technology, biotechnology, and other technologies to enable innovation in applications?

Bresnahan and Trajtenberg (1995) outline three characteristics of general-purpose technologies: their pervasiveness across sectors, their development and improvement over time, and their ability to spur innovation in their own and other sectors. These characteristics are useful for guiding the measurement of general-purpose technologies. Tracking knowledge generation in these technologies, their diffusion to other sectors, and the linkages among them is important for understanding innovation and other sources of growth in the economy.

Measuring general-purpose technologies poses two main difficulties. The first is that not all of these technologies can be properly identified as belonging to a particular sector, because they are spread across different industry classifications. The second difficulty arises in identifying the use of these technologies in other sectors. The extent of these difficulties clearly varies across technologies. Information and communication technology is by far the best covered in statistics in terms of both industry classification and identification of investments in other sectors.

A number of the data sources discussed in this chapter can be used to generate indicators of general-purpose technologies. For example, patents and trademarks can be used to measure the use of such technologies for knowledge creation in sectors other than those in which they were developed, and both patent and bibliometric data can be used to measure the linkages among general-purpose technology sectors. R&D data provide an indicator of knowledge generation in sectors that develop general-purpose technologies, as do broader measures of investment in these technologies. In addition, the BRDIS contains data on the percentage of R&D in energy applications, environmental protection applications, software, medical clinical trials, biotechnology, and nanotechnology. These data can potentially be used to investigate the extent of R&D in these technologies across sectors (thus giving a picture of how “general-purpose” these technologies are).
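
A minimal sketch of one such indicator appears below: the share of patents in a general-purpose technology class that are filed by firms outside the sector in which the technology originated, together with a count of distinct patenting sectors as a rough pervasiveness measure. The input file and its field names (patent_id, assignee_sector, gpt_class) are assumptions for illustration, not an existing NCSES dataset or procedure.

import csv
from collections import Counter

def gpt_diffusion_share(path, origin_sector):
    """Compute (share of GPT-class patents filed outside the originating sector,
    number of distinct patenting sectors) from a CSV with hypothetical columns
    patent_id, assignee_sector, gpt_class (gpt_class == "1" marks the GPT class)."""
    outside = 0
    total = 0
    sectors = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["gpt_class"] != "1":
                continue  # keep only patents in the GPT's technology class
            total += 1
            sectors[row["assignee_sector"]] += 1
            if row["assignee_sector"] != origin_sector:
                outside += 1
    share = outside / total if total else 0.0
    return share, len(sectors)

if __name__ == "__main__":
    # "gpt_patents.csv" and the origin sector code are illustrative placeholders.
    share, n_sectors = gpt_diffusion_share("gpt_patents.csv", origin_sector="5415")
    print(f"Share of GPT patents outside the origin sector: {share:.2%}")
    print(f"Number of distinct patenting sectors: {n_sectors}")

A higher outside share and a larger sector count would point toward a technology that is diffusing beyond its originating sector, which is one of the Bresnahan and Trajtenberg characteristics noted above.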

NCSES currently publishes a number of statistics on general-purpose technologies—particularly for information and communication technology, but increasingly also for green technologies. The panel encourages NCSES to continue this work and also to build on current indicators in this area. In particular, NCSES should examine possibilities for better coverage of the diffusion and uptake of general-purpose technologies in sectors other than those in which they were developed, using both BRDIS and other data sources.

RECOMMENDATION 5-4: The National Center for Science and Engineering Statistics (NCSES) should develop a suite of indicators that can be used to track the development and diffusion of general-purpose technologies, including information and communication technologies, biotechnology, nanotechnology, and green technologies. NCSES should attempt to make greater use of data from the Business Research and Development and Innovation Survey for this purpose while also exploring the use of other sources, such as patent and bibliometric data.

Subnational Issues in Measuring New Knowledge and Knowledge Networks

Compared with the measurement of innovation, the measurement of knowledge production is more clearly connected to geographic location. A number of initiatives by successive administrations have emphasized the ability to locate federal S&T research grants at very detailed geographic levels, down to neighborhoods. Of course, some of this detail is spurious: establishment data may link to a postal address while the actual economic activity is carried out over a territory of some size, depending on the industry. Moreover, much of the value derived from these targeted investments comes from the trade of goods and services, which is dispersed geographically.

Still, disaggregation to levels much finer than the state is certainly possible. For example, universities are well-behaved geographic phenomena in that they remain in one place, and their relation to various nested administrative hierarchies is straightforward. Their laboratories and research facilities are similar to those of other establishments; indeed, some resemble industrial facilities, with loading docks, employees, and so on. The movement of goods and people in the university establishment can be accounted for in the production of scientific work.

Some success appears to have been achieved in tying basic science outputs to spatial units. Geographic identifiers appear in many contexts, including the author affiliations listed on publications and the addresses on patent applications. With care, and with some level of error, these outputs can be linked to a location. Measuring the impacts of research investments, however, remains difficult, particularly with spatial disaggregation. Especially challenging to measure are the geographic instantiation of a knowledge network and the flows of knowledge from place to place.
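
The sketch below illustrates the basic linking step in this kind of exercise: matching place names embedded in an affiliation string to subnational codes using a simple gazetteer lookup. The field names, the gazetteer file, and the matching rule are illustrative assumptions; production-quality geocoding would require institution-level disambiguation and error handling well beyond this sketch.

import csv
import re

def load_gazetteer(path):
    """Build a lookup from lowercase place name to (county FIPS, state) using a
    CSV with hypothetical columns place_name, county_fips, state."""
    lookup = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            lookup[row["place_name"].lower()] = (row["county_fips"], row["state"])
    return lookup

def geolocate_affiliation(affiliation, gazetteer):
    """Return the first (county FIPS, state) pair whose place name appears as a
    comma- or semicolon-delimited segment of the affiliation string, else None."""
    for token in re.split(r"[,;]", affiliation.lower()):
        hit = gazetteer.get(token.strip())
        if hit:
            return hit
    return None

if __name__ == "__main__":
    # Toy in-memory gazetteer for the example; a real run would call load_gazetteer().
    gaz = {"ann arbor": ("26161", "MI")}
    print(geolocate_affiliation(
        "Dept. of Physics, University of Michigan, Ann Arbor, MI", gaz))

Even this simple rule makes visible the main sources of error: ambiguous place names, affiliations that list only an institution, and establishments whose activity is not actually located at the listed address.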

As a reference for understanding the national system of R&D, it may be worthwhile to examine the results of a major study conducted in Canada in 2011 (Jenkins et al., 2011). A six-member expert panel carried out a full review of the country’s federal R&D support programs. While one important theme concerned Canada’s balance between tax credits and direct R&D support, the authors’ comprehensive study of the whole system of R&D support programs bears examination for application to the United States. The Canadian panel surveyed more than 60 institutes and programs engaged in supporting business innovation. The distribution was highly skewed, with a few relatively large entities and many small ones. Because each was created under a distinct charter, there is little coherence in the criteria used to evaluate effectiveness, a common problem worldwide. The tendency, as in other countries, is to concentrate on generating more investment in R&D rather than on providing mechanisms for industry to obtain the assistance needed to overcome current problems in operations. Certain gaps also became evident from this comprehensive analysis, leading the Canadian panel to offer recommendations for short-term measures to improve the effectiveness of the country’s innovation system. The Canadian panel notes that the responsibility for fostering innovation cuts across many functions of government and therefore requires a system-wide perspective and whole-of-government priority. That panel’s recommendations include

making encouragement of innovation in the Canadian economy a stated objective of federal procurement policies and programs, restructuring procurement procedures to allow more latitude for innovative solutions to emerge, and reorienting existing federal laboratories to serve sectoral needs.32

The important message in the present context is that certain aspects of the innovation system emerge from a comprehensive view of the whole. Canada invested the efforts of a distinguished panel in such a process, with clear results for managing its system. That panel’s analysis also raised the question of how to compare existing R&D support programs. Although NCSES, as a statistical office, does not conduct evaluation, it should be in a position to provide information on government programs that would support the evaluation done by others.

SUMMARY

In this chapter, the panel has offered four recommendations regarding the development of indicators of knowledge generation, knowledge networks, and knowledge flows. The focus is on techniques that should be used to develop indicators that users want for specific market sectors and that improve the international comparability of the data. The panel also suggests that the production of certain measures is not in NCSES’s purview, and these measures should instead be acquired from other agencies. In the near term, NCSES should give priority to using tools that are readily available at the agency and continuing existing collaborations with other agencies while developing new techniques and cultivating new linkages over time.

____________________

32The report lists the following sectors (p. 3-13): Goods Industries (agriculture, forestry, fishing and hunting; manufacturing; construction; utilities; and oil and gas and mining); Services Industries (transportation and warehousing; information and cultural industries; wholesale trade; retail trade; finance and insurance, real estate and rental and leasing; professional, scientific, and technical services; and other services); and Unclassified Industries.
