7 Innovation Measurement Agendas of the Future
Pages 89-98



From page 89...
... During this session, presenters continued discussing several of the data-related questions raised at various points during the workshop: To what extent can the digital revolution transform metrics in the area of innovation measurement? What are the roles of specialized surveys in innovative data collection?
From page 90...
... It allows them to avoid having to reinvent the wheel, Stern noted. The authors identified an approach using citation data to track the diffusion of knowledge from scientific papers deposited in an institutional environment that promotes cumulativeness -- a biological resource center -- compared with similar articles that remained in a less open system.
From page 91...
... Her results suggest an approximately 30 percent reduction in subsequent publications, phenotype-genotype linkages, and diagnostic tests for genes first sequenced by Celera -- the company's licensing rights affected cumulative knowledge production at the level of a research community. Turning to the topic of uncertainty, and the highly skewed nature of knowledge creation and innovation, Stern highlighted three points.
From page 92...
... This is analogous to the idea behind the biological resource centers that Stern discussed. The key to increasing the value of patent data is to create the capacity to link them to a web of other innovative indicators, Oldham said.
From page 93...
... A wide range of patent statistics can be added. Researchers can also compute their own patent statistics using simple Structured Query Language (SQL)
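The kind of researcher-computed patent statistic described above can be sketched in a few lines. This is a toy illustration only: the table schema, assignee names, and citation counts are all invented, not any real patent office format, and the query runs against an in-memory SQLite database.

```python
import sqlite3

# In-memory toy database; the schema and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE patents (
    patent_id TEXT, assignee TEXT, grant_year INTEGER, n_citations INTEGER)""")
conn.executemany(
    "INSERT INTO patents VALUES (?, ?, ?, ?)",
    [("P1", "Acme", 2010, 12),
     ("P2", "Acme", 2011, 3),
     ("P3", "Widgets Inc", 2010, 7)])

# A simple self-computed statistic: patent counts and mean forward
# citations per assignee per grant year.
rows = conn.execute("""
    SELECT assignee, grant_year, COUNT(*) AS n_patents,
           AVG(n_citations) AS mean_citations
    FROM patents
    GROUP BY assignee, grant_year
    ORDER BY assignee, grant_year
""").fetchall()
for row in rows:
    print(row)
```

The same GROUP BY pattern extends to whatever fields a richer patent database exposes (technology class, inventor country, and so on).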
From page 94...
... He also noted the promise of remote access protocols whereby certified individuals within an organization run programs and deliver the results to external researchers, or whereby researchers compute over particular fields or experiment with a sample. Sallie Keller (Social and Decision Analytics Laboratory)
From page 95...
... Program underlying the Quarterly Workforce Indicators. By linking these data, analysts at the Census Bureau can identify individuals and match them to records in the LEHD infrastructure, which includes data on all jobs in the economy covered by state unemployment insurance programs.
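The actual LEHD linkage relies on protected identifiers and far richer records, but the basic idea of joining worker records to job records on a shared key can be illustrated with entirely made-up data; every field name and value below is hypothetical.

```python
# Toy illustration of linking person records to job records by a shared
# identifier, loosely analogous to (but far simpler than) the LEHD
# linkage. All names and data are invented.
workers = {"W1": {"name": "A"}, "W2": {"name": "B"}}
jobs = [
    {"worker_id": "W1", "employer": "E1", "quarter": "2020Q1"},
    {"worker_id": "W1", "employer": "E2", "quarter": "2020Q2"},
    {"worker_id": "W2", "employer": "E1", "quarter": "2020Q1"},
]

# Group job records by worker to build a per-person employment history.
history = {}
for job in jobs:
    history.setdefault(job["worker_id"], []).append(job)

for wid, person in workers.items():
    print(wid, person["name"], len(history.get(wid, [])), "job records")
```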
From page 96...
... LEVERAGING NONSURVEY, LOCAL DATA SOURCES Sallie Keller (Social and Decision Analytics Laboratory) pulled together a number of discussion threads developed over the course of the workshop about combining alternative types of data to analyze and understand society.
From page 97...
... The challenge with using such a diverse data landscape is data quality -- both within and across buckets. The traditional approach in science involves controlling measurement processes, whether with a survey or a physical bench lab instrument.
From page 98...
... Finally, a disciplined, yet flexible and adaptable, data framework is needed to assess data quality and fitness-for-use. Keller said she envisions that analytic use of granular data from multiple sources, including data quality assessments, can be done in a disciplined way, so that students in 10 years will have an approach to administrative, opportunity, and procedural data as rigorous as the one currently in place for survey data.
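A minimal sketch of the kind of fitness-for-use profiling such a framework might start from is per-field completeness and validity rates. The records, fields, and validity rules below are invented for illustration, not part of any framework Keller described.

```python
# Hypothetical records with one missing value and two invalid ones.
records = [
    {"age": 34, "zip": "22030"},
    {"age": None, "zip": "22030"},
    {"age": 230, "zip": "2203"},   # out-of-range age, malformed zip
]

def completeness(records, field):
    """Share of records with a non-missing value for `field`."""
    return sum(r[field] is not None for r in records) / len(records)

def validity(records, field, rule):
    """Share of non-missing values for `field` satisfying `rule`."""
    vals = [r[field] for r in records if r[field] is not None]
    return sum(rule(v) for v in vals) / len(vals)

print(completeness(records, "age"))                      # 2 of 3 present
print(validity(records, "age", lambda a: 0 <= a < 120))  # 1 of 2 valid
print(validity(records, "zip", lambda z: len(z) == 5))   # 2 of 3 valid
```

Running the same checks within each data source and again on the combined data is one simple way to assess quality both "within and across buckets," as the preceding pages put it.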

