Appendix D
BIBLIOMETRIC ANALYSIS

Data Sources and Definitions

Journal Sets

This study analyzes data from an NIH data set established in 1981 and based on the approximately 1,000 journals that are indexed by both the National Library of Medicine’s Medline and the Science Citation Index (SCI). The data are both valid and consistent over time, and cover almost all NIH intramural research publication activity from 1973 to 1984. Although this data base does not permit comparison of NIH intramural and extramural research support activities, it offers an opportunity for comparison with all publications whose authors’ addresses were given as universities or medical schools. Data have been accumulated into three four-year sets so that sufficient numbers of publications would be included to provide reliable comparisons.

Publication Counts

The number of papers assigned is based on fractional counts of author institutions. For example, if an NIH intramural scientist co-authors a paper with two authors in two different universities, each institution receives credit for 0.33 publication.
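The fractional-counting rule can be sketched in Python. This is an illustrative sketch only; the function name and data are hypothetical, and the report does not specify an implementation.

```python
# Illustrative sketch of fractional publication counting: a paper with
# n author institutions credits each institution 1/n of a publication.
from collections import defaultdict

def fractional_counts(papers):
    """papers: list of lists of author institutions, one list per paper."""
    credit = defaultdict(float)
    for institutions in papers:
        share = 1.0 / len(institutions)
        for inst in institutions:
            credit[inst] += share
    return dict(credit)

# The example from the text: an NIH intramural scientist co-authoring
# with two authors at two different universities -> three institutions,
# each credited with about 0.33 of a publication.
papers = [["NIH", "University A", "University B"]]
print(fractional_counts(papers))
```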

Activity Index

The Activity Index measures how active the intramural program is in a given field or subfield: the percentage of intramural papers that fall in the subfield is divided by the percentage of all papers that fall in the same subfield. An index greater than 1.0 indicates that the intramural program is relatively more active in that subfield than the covered literature as a whole.
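The ratio described above can be written out as a short function. The numbers below are hypothetical and chosen only to make the arithmetic visible.

```python
def activity_index(intramural_in_subfield, intramural_total,
                   all_in_subfield, all_total):
    """Share of intramural papers in a subfield divided by the share of
    all covered papers in that subfield; values above 1.0 indicate the
    intramural program is relatively more active in the subfield."""
    intramural_share = intramural_in_subfield / intramural_total
    overall_share = all_in_subfield / all_total
    return intramural_share / overall_share

# Hypothetical example: 200 of 2,000 intramural papers in a subfield
# (10 percent) versus 5,000 of 100,000 covered papers (5 percent)
# yields an Activity Index of 2.0.
print(activity_index(200, 2_000, 5_000, 100_000))  # 2.0
```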

Citation Frequency

Each time a paper in a data set is cited by the author(s) of subsequent papers in the same or any other covered journal, the article is credited. Older articles have time to accumulate more citations and are not, therefore, directly comparable with more recent papers. For the present analysis, a ratio has been calculated for each of the three four-year accumulations of papers: the average number of citations per paper received by intramural papers has been divided by the average number of citations per paper received by those authored by university and medical school investigators. These ratios permit comparison of intramural with academic average citation experience over time.
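The citation ratio computed for each four-year accumulation can be sketched as follows; the figures are invented for illustration and do not come from the study.

```python
def citation_ratio(intramural_citations, intramural_papers,
                   academic_citations, academic_papers):
    """Average citations per intramural paper divided by average
    citations per academic (university/medical school) paper, within
    one four-year accumulation of papers."""
    return ((intramural_citations / intramural_papers)
            / (academic_citations / academic_papers))

# Hypothetical example: 9,000 citations to 1,000 intramural papers
# (9.0 per paper) versus 60,000 citations to 10,000 academic papers
# (6.0 per paper) gives a ratio of 1.5.
print(citation_ratio(9_000, 1_000, 60_000, 10_000))  # 1.5
```

Computing the ratio within each four-year set sidesteps the problem that older papers have had more time to accumulate citations.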



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.





Field Coverage

All journals included in the SCI data bases have been classified by Computer Horizons, Inc. (CHI) into “Fields” and “Subfields” (or disciplines). A subfield may include one or more disciplines; the decision by CHI to combine disciplines was based on size considerations or on the fact that some journals publish papers in two or more related disciplines. Attention was focused primarily on subfields included in the fields of clinical medicine and biomedical research. The revolution in basic biomedical science methods that has occurred since the Watson-Crick elucidation of the roles of DNA and RNA in the biological and biomedical sciences has blurred distinctions, even among the subfields designated “Biomedical Research.” The distinctions and the titles are retained for convenience in analysis and discussion.

Findings

Findings from the bibliometric analysis are summarized in Chapter 2. Details of the analysis of intramural program performance in scientific subfields follow.

There are 27 specific clinical and basic biomedical subfields in which the NIH intramural program can be considered most active: (a) activity indexes were greater than 1.0 and more than 40 papers were produced during the 1981–1984 period, or (b) more than 100 papers were produced though activity indexes were less than 1.0 (radiology-nuclear medicine and surgery). Eighteen of these 27 subfields produced more papers during 1981–1984 than during 1977–1980.

Of these, 11 can be regarded as bedrocks of strength in the intramural program. Included are eight in clinical medicine—dermatology-venereal disease, endocrinology, hematology, immunology, pathology, radiology-nuclear medicine, respiratory system, and surgery; they are joined by three basic biomedical subfields—biochemistry-molecular biology, cell biology-cytology-histology, and parasitology. This group contains the two largest intramural subfields, in which the largest numbers of NIH papers are published (biochemistry-molecular biology and immunology), and four that barely met criteria for inclusion among the 27 most active subfields—dermatology and respiratory research (barely enough papers) and radiology-nuclear medicine and surgery (low activity indexes). These 11 subfields have been singled out because, over the entire period 1973–1984, they have exhibited stability or increasing strength in quality measures as well as in productivity. Some, such as dermatology, respiratory system, and parasitology, have exceptionally large percentages of papers among the most highly cited 10 percent (32 to 44 percent). It should be noted, however, that when numbers are small, as they are in two of these subfields, only a few outstanding performances are required to achieve sizable percentage records. For the very large subfields, such as endocrinology, immunology, biochemistry-molecular biology, and cell biology-cytology-histology, the opposite is true. Activity indexes for these subfields range from 2.3 to 4.3, and their papers play a significant part in defining the top 10 percent, yet more than 20 percent of endocrinology and immunology papers were in that highest decile in 1981–1984. Both journal influence and citation measures of the cell biology group have improved throughout the 1973–1984 period; top citation decile papers increased from 12 to almost 18 percent. The largest intramural subfield, biochemistry-molecular biology, despite its size has maintained a strong record of performance in general—one far superior to the university-medical school record (15.2–16.2 percent of papers in the top citation decile, compared with 9.7–9.9 percent).

Five subfields that have significantly increased the numbers of papers produced have been less fortunate in maintaining quality. Cardiovascular system, dentistry, pharmacology, physiology, and microbiology each have suffered declines in at least two of the three quality measures employed—average journal influence, ratio of intramural citations per paper to academic citations per paper, and percent of papers in the top citation decile. In all of these subfields, the number of papers in the top citation decile declined, and in all except physiology, citations-per-paper ratios also declined. The average journal influence rating declined precipitously for physiology papers, and markedly for all others except dentistry. These results suggest that problems may have been encountered in attracting high-level new junior talent, as well as in retaining or replacing leadership.

Two additional subfields that have steadily increased the number of papers produced, but do not fit either of the categories above—fertility and neurology-neurosurgery—have stable but unimpressive records on quality measures. Average journal influence measures are approximately the same as the academic averages, and citation measures are only slightly above them.

Six major subfields that substantially increased their production of research papers during the 1977–1980 period but produced fewer papers during the 1981–1984 period are general and internal medicine, embryology, hygiene-public health, biophysics, tropical medicine, and virology. In all of these subfields except embryology, average journal influence ratings remained higher than those in the corresponding academic sector (substantially higher for general and internal medicine). Citations-per-paper ratios were fairly stable, but for each of these subfields the percent of papers in the highest citation decile declined noticeably. It appears likely that these subfields lost or suffered turnovers among leading scientists on their staffs, or possibly other kinds of disruptions occurred, resulting in reduced performance. General and internal medicine continues very strong, but in 1977–1980, of 639 papers, almost 33 percent were among the most highly cited 10 percent of all U.S. papers, while in 1981–1984, with 596 papers, only 23.9 percent appeared in the highest citation decile—still impressive, but a substantial reduction.

In addition to the six subfields noted above, cancer and genetics-heredity produced fewer papers in 1981–1984, following substantial growth in the 1977–1980 period. Almost all of the intramural research in these subfields is supported by the National Cancer Institute (NCI). Unlike other institutes, whose obligations in constant dollars continued to increase between 1977 and 1980 (the period affecting 1981–1984 publications), NCI’s obligations declined steadily and substantially (NIH Data Book, 1982). The high correlation that exists between budgetary changes and publications three years later (r=.90 in NIH studies) suggests the possibility that the NCI intramural programs in these two subfields, and in virology, may have been affected, as extramural programs certainly were. Although the number of cancer and genetics-heredity papers produced in 1981–1984 declined, quality measures improved for both fields (except the average journal influence of cancer papers, which declined slightly). In 1977–1980 the intramural program produced almost 11 percent of all U.S. cancer papers. The percentage dropped to 8.7 in 1981–1984, but it is apparent that intramural papers played a substantial role in defining the top citation deciles. Intramural performance increased steadily in each four-year period, as the percent in the highest citation decile grew from 12 to 18 percent; academic institutions placed only 9.4 to 9.8 percent of their papers at this level.

Ophthalmology was the only one of the most active intramural program disciplines to decline in every measure of size and quality between the mid-1970s and the mid-1980s. The activity index remained greater than 1 (1.4, down from 2.2), but top-decile performance declined from a peak of 22.3 percent to 15.1 percent in 1981–1984.

There are 17 subfields, not including miscellaneous clinical and miscellaneous biomedical research, in which relatively few NIH intramural scientists publish. In most of these “small” subfields, fewer papers were published in the early 1980s than in the late 1970s, though in eight of them relatively high citation records improved. In 1981–1984, the following subfields placed more than 20 percent of their papers in the highest citation decile: allergy (31.8 percent), arthritis (23.6), otorhinolaryngology (22.2), pediatrics (30.7), urology (33.3), nephrology (23.5), biomedical engineering (47.8), and microscopy (33.3). It must be remembered that these statistics are based on small numbers of papers. Nevertheless, the evidence here suggests that some areas of research are being held back, or at least not being encouraged or given the opportunity to grow, in spite of the strength and prominence of existing scientific staff, while some other more highly favored (or more highly budgeted) disciplines may be promoted beyond the capacity of NIH to attract the best available talent.