federal agency, especially from recognized leaders in the field, would be a reasonable quantitative measure of this kind of adaptive response. However, it may not always be clear whether such a response reflects recognition of a valuable scientific direction by the intellectual leaders in the field, or whether it is a response forced by the economic conditions of support in that field. In addition, scientists with existing grants may propose continuations of current research under relabeled, politically fashionable titles.

It is tempting to ascribe a positive value to the speed and extent of an adaptive response, perhaps because of the tendency in our society to equate lack of change with monotony and stagnation. Since change is undeniably exciting, it is often regarded as desirable and laudable, and the speed and extent of change are generally taken to indicate vigor. Rapid change, however, can also indicate immaturity or instability. If long-term scientific directions motivated by sound historical and philosophical imperatives are neglected as a result of mass gravitation toward funding opportunities, the intellectual vigor of the discipline may in fact be questionable. Thus any interpretation of numerical metrics of adaptability must take into account value judgments of the sort indicated above. Indeed, there are no objective metrics, as the discussion up to this point has amply demonstrated.

One way to determine adaptability is to review the results of a discipline's research. For the identification and cultivation of major scientific opportunities, however, a span of 10 years is barely sufficient in the context of basic research. Nevertheless, the response of professionals in the discipline to important scientific developments in the preceding N years could be a telling indicator, although one that is not easily quantified by a statistical index. Determining the fraction of principal investigators who receive support from more than one mission agency might also be useful. Finally, the number of new topical groups in scientific societies and the number of new topical meetings scheduled could be indicative of a discipline's adaptability, as well as its growth. These indicators can be reviewed both across fields and within fields.
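To make the multi-agency indicator concrete, the following is a minimal sketch of the computation it implies. The record format, the investigator names, and the agency labels are invented for illustration; an actual grant database would of course look different.

```python
# Hypothetical sketch: fraction of principal investigators supported by
# more than one mission agency. Records are invented for the example.
from collections import defaultdict

# (pi_name, funding_agency) pairs, one per active grant
grants = [
    ("alvarez", "NSF"), ("alvarez", "DOE"),
    ("baker",   "NSF"),
    ("chen",    "NASA"), ("chen", "NSF"), ("chen", "DOD"),
    ("davis",   "DOE"),
]

# Collect the set of distinct agencies supporting each PI
agencies_per_pi = defaultdict(set)
for pi, agency in grants:
    agencies_per_pi[pi].add(agency)

multi = sum(1 for agencies in agencies_per_pi.values() if len(agencies) > 1)
fraction = multi / len(agencies_per_pi)
print(f"{multi} of {len(agencies_per_pi)} PIs ({fraction:.0%}) hold multi-agency support")
```

Tracked over successive years, such a fraction would give one numerical trace of the adaptive behavior discussed above, subject to the interpretive caveats already noted.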

ADDITIONAL ISSUES IN CONDUCTING QUANTITATIVE ASSESSMENTS

Resources

There are two resource issues associated with a quantitative assessment: the requisite expertise to conduct such an assessment and adequate funding. Although these requirements also arise in a peer review assessment, they are amplified when the analysis involves the acquisition and interpretation of diverse data from numerous sources.

In the past, the typical NRC discipline assessment has relied exclusively or predominantly on experts who work in the major subdisciplines that constitute the discipline and who are broadly knowledgeable about its major issues and opportunities. Despite the danger that such studies will be perceived as self-serving or as espousing parochial interests, it is important for practitioners to assess their own discipline. They are generally in the best position to identify the most pressing problems and the most promising opportunities, as well as to suggest appropriate remedial or supportive actions.

Nevertheless, when collection and analysis of diverse data are integral to the study, the addition of several researchers from related or complementary disciplines could be useful, including, for example, a statistician, a historian or sociologist of science (or both), and one or two economists specializing in human resources and econometrics related to the R&D process. Such individuals might make a valuable contribution to structuring the study appropriately, interpreting the data, and broadening the analysis.

A quantitative approach to studying a discipline also requires adequate funding to cover the requisite expertise and additional staff time. If the analysis relies solely on existing data sources, the costs will be significantly lower than if it includes the collection of new data through statistical sampling, a formal survey, or the purchase of customized data runs from commercial sources. The costs of generating new data can in fact be prohibitive, and spending large amounts of money on data for assessments may be neither cost-effective nor necessary.

Issues in Acquiring and Interpreting Data

As has already been noted, there are significant problems associated with the use of existing data sources in performing a credible quantitative analysis and drawing an accurate statistical portrait of a discipline. Although each of the three panels experienced some unique difficulties in obtaining and analyzing the various data, the problems they encountered can be categorized according to the availability, quality, and comparability of the data.

Availability

As might be expected, certain classes of data are maintained regularly and are more readily available than others. For instance, the biennial demographic statistics on Ph.D. degree recipients collected by the NRC's Office of Scientific and Engineering Personnel (OSEP) and by NSF are readily accessible and can yield useful information. Some professional societies conduct yearly or periodic membership surveys, although these surveys are not necessarily representative. Other consistently maintained data of potential relevance are collected by the Bureau of Labor Statistics, the National Center for Education Statistics, and the Census Bureau.

The major sources of long-term demographic data on science and engineering personnel, namely the OSEP Survey of Doctorate Recipients and Survey of Earned Doctorates, as well as NSF's Science and Engineering Indicators and the agency's more specialized demographic surveys, are 1 to 3 years old at any given time, and so they are of limited use for detecting current status and near-term trends. This time lag significantly limits the usefulness of demographic data series both for analyzing the existing state of a field and for planning purposes. Nevertheless, for some small and well-defined fields such as astronomy, or for well-monitored fields such as the mathematical sciences, more up-to-date statistics can be found in other sources, for example, in the case of astronomy, the monthly listing of available positions in the American Astronomical Society's job register.

Most of the available demographic data cover only the United States. Those data that are available for other countries use different definitions, categorizations, and survey methodologies, making any meaningful international comparisons very difficult.

Moreover, except for the data on human resources, the available data are spotty at best. Detailed funding records from the agencies are often hard to obtain or simply unavailable. Although NSF maintains data on funding for all agencies according to discipline, such data are sometimes difficult to interpret and are often too aggregated to use for subdisciplines. In addition, the sample sizes are typically too small to permit meaningful disaggregation. Funding data for other nations are even scarcer, and the levels of expenditures related to industry or military programs are generally unavailable because of proprietary or national security considerations. Data indicating output or impact also are not broadly available, especially since there is disagreement as to what constitutes output, how it should be measured, and what any given measurements might tell us.

A final issue is the paucity of data available for narrowly defined fields or new subfields. Most surveys look only at large fields, precluding any further disaggregation of the results. The lack of detail in available data proved to be a major problem in looking at the interdisciplinary subfield of AMO science, for example. Even the data with some degree of detail presented significant difficulties. For instance, the best available data on science and engineering personnel, the two OSEP surveys, include subcategories for atomic and molecular physics and for optics, but both are listed under physics only, although a substantial amount of the research in AMO science is more properly categorized under chemistry or engineering. In addition, because of the small number of individuals in the existing subcategories, any further disaggregation by demographic characteristics such as gender, race, and age, among others, becomes statistically unreliable. This is true even for a well-defined, but relatively small, field such as astronomy. The small number of scientists in such fields does, however, make it feasible to conduct a customized survey.
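The statistical point is easy to make concrete. For an estimated proportion, sampling uncertainty grows as the subcategory count shrinks; the sketch below, using invented sample sizes rather than actual OSEP or NSF counts, shows the approximate 95 percent margin of error under the usual normal approximation.

```python
# Illustrative only: approximate 95% margin of error for an estimated
# proportion at several subcategory sample sizes. The sizes are invented;
# they are not taken from the OSEP or NSF surveys.
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Normal-approximation half-width of a 95% confidence interval."""
    return z * math.sqrt(p * (1 - p) / n)

# Estimating a 30% share as the subcategory is disaggregated ever further
for n in (2000, 200, 40, 10):
    print(f"n = {n:4d}: 30% +/- {margin_of_error(0.30, n):.1%}")
```

At a few dozen respondents the uncertainty rivals the estimate itself (and the normal approximation is itself strained), which is the sense in which further disaggregation becomes statistically unreliable.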

Quality

The quality of existing data is no less a problem than their availability. Although the statistical compilations of NSF and OSEP, and the commercially available data sets reviewed by the panels, appeared to conform to rigorous statistical methods, with their limitations and uncertainties well documented, not all sources are equally reliable, and it is often not possible to determine the accuracy of a given source.

Comparability

The problems associated with even good-quality data are compounded with every new source that is added, because each data set has its own scope, definitions, and biases. In the case of time series, this difficulty is further complicated by the unique circumstances, both internal and external, of each data collection point. Moreover, long time series such as the OSEP and NSF demographic data, which typically go back two or three decades and are the most useful sources for long-term retrospective analyses, reflect an inherent paradox. On the one hand, if the discipline and subdiscipline categories chosen initially are held fixed to permit comparison over the years, they will inevitably atrophy and be supplanted by newly emerging areas of inquiry. This tendency, of course, is most likely to be a problem for the most dynamic and rapidly changing disciplines, which arguably are the ones about which we need the most information. In the OSEP and NSF surveys that break major fields into a handful of subfields, this dynamic change, and the difficulty of characterizing it accurately, are evidenced by the great increase in survey responses falling into the “general” or “other” subcategories. On the other hand, periodic adjustments made to the survey methodology to keep the categories current introduce statistical inconsistencies or discontinuities into the time series.
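One common remedy for such category drift, sketched below with invented labels and counts rather than actual survey codes, is to map each wave's subfield labels onto a fixed crosswalk before constructing the series; whatever cannot be mapped falls into an explicit "other" bin, which also flags where comparability is eroding.

```python
# Hypothetical crosswalk: re-binning subfield labels that changed between
# survey waves so that counts stay comparable over time. All labels and
# counts are invented for illustration.
CROSSWALK = {
    # earlier-wave labels -> harmonized category
    "atomic physics":     "AMO science",
    "molecular physics":  "AMO science",
    "optics":             "AMO science",
    # later-wave label
    "atomic, molecular, and optical physics": "AMO science",
}

def harmonize(wave_counts: dict[str, int]) -> dict[str, int]:
    """Re-bin one survey wave's counts; unmapped labels fall into 'other'."""
    out: dict[str, int] = {}
    for label, count in wave_counts.items():
        key = CROSSWALK.get(label, "other")
        out[key] = out.get(key, 0) + count
    return out

wave_early = {"atomic physics": 120, "molecular physics": 90, "optics": 60}
wave_late = {"atomic, molecular, and optical physics": 310, "quantum information": 25}

print(harmonize(wave_early))  # {'AMO science': 270}
print(harmonize(wave_late))   # {'AMO science': 310, 'other': 25}
```

Watching the size of the "other" bin grow over successive waves gives a simple quantitative signal of the very atrophy described above.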
