
8
Information Technology Laboratory


PANEL MEMBERS

Tony Scott, General Motors Corporation, Chair

Albert M. Erisman, Institute for Business, Technology, and Ethics, Vice Chair

Michael Angelo, Compaq Computer Corporation

Bishnu S. Atal, AT&T Laboratories-Research

Matt Bishop, University of California, Davis

Linda Branagan, Secondlook Consulting

Jack Brassil, Hewlett-Packard Laboratories

Aninda DasGupta, Philips Consumer Electronics

Susan T. Dumais, Microsoft Research

John R. Gilbert, Xerox Palo Alto Research Center

Roscoe C. Giles, Boston University

Sallie Keller-McNulty, Los Alamos National Laboratory

Stephen T. Kent, BBN Technologies

Jon R. Kettenring, Telcordia Technologies

Lawrence O’Gorman, Avaya Labs

David R. Oran, Cisco Systems

Craig Partridge, BBN Technologies

Debra J. Richardson, University of California, Irvine

William Smith, Sun Microsystems

Don X. Sun, Bell Laboratories/Lucent Technologies

Daniel A. Updegrove, University of Texas, Austin

Stephen A. Vavasis, Cornell University

Paul H. von Autenried, Bristol-Myers Squibb

Mary Ellen Zurko, IBM Software Group

Submitted for the panel by its Chair, Tony Scott, and its Vice Chair, Albert M. Erisman, this assessment of the fiscal year 2002 activities of the Information Technology Laboratory is based on a site visit by the panel on February 26-27, 2002, in Gaithersburg, Md., and on documents provided by the laboratory.1

1. U.S. Department of Commerce, Technology Administration, National Institute of Standards and Technology, Information Technology Laboratory Technical Accomplishments 2001, NISTIR 6815, National Institute of Standards and Technology, Gaithersburg, Md., November 2001; U.S. Department of Commerce, Technology Administration, National Institute of Standards and Technology, Report to the ITL Assessment Panel, National Institute of Standards and Technology, Gaithersburg, Md., February 2002; U.S. Department of Commerce, Technology Administration, National Institute of Standards and Technology, Information Technology Laboratory Publications 2001, National Institute of Standards and Technology, Gaithersburg, Md., February 2002.


LABORATORY-LEVEL REVIEW

Technical Merit

The mission of the Information Technology Laboratory (ITL) is to strengthen the U.S. economy and improve the quality of life by working with industry to develop and apply technology, measurements, and standards. This mission is very broad, and the programs not only encompass technical and standards-related activities but also provide internal consulting services in mathematical and statistical techniques and computing support throughout NIST.2 To carry out this mission, the laboratory is organized in eight divisions (see Figure 8.1): Mathematical and Computational Sciences, Advanced Networking Technologies, Computer Security, Information Access, Convergent Information Systems, Information Services and Computing, Software Diagnostics and Conformance Testing, and Statistical Engineering. The activities of these units are commented on at length in the divisional reviews in this chapter. Below, some highlights and overarching issues are discussed.

The technical merit of the work in ITL remains strong. As part of its on-site reviews, the panel had the opportunity to visit each of the divisions for a variety of presentations and reviews related to the projects currently under way. While it is not possible to review every project in the greatest detail, the panel has been consistently impressed with the technical quality of the work undertaken. The panel also particularly applauds ITL staff’s willingness to take on difficult technical challenges while demonstrating an appropriate awareness of the context in which NIST results will be used and the importance of providing data and products that are not just correct and useful but also timely. Many examples of programs with especially strong technical merit are highlighted in the divisional reviews.

The panel is very pleased to see the progress that has occurred in strategic planning in ITL. A significant development over the past year has been the emergence and acceptance of a framework under which the laboratory activities operate. The framework includes the ITL Research Blueprint and the ITL Program/Project Selection Process and Criteria. The panel observed that these descriptions and tools appear to be well institutionalized within each of the divisions and seem to be having a positive initial impact on improving the direction and efficacy of laboratory projects and programs. These frameworks were widely used in the presentations made to the panel, and the panel noted the emergence of a common “vocabulary” with respect to planning and strategy. Increased collaborations between divisions were also observed. The panel also continues to see progress in the divisions on rational, well-justified decisions about what projects to start and conclude and when to do so.

Program Relevance and Effectiveness

ITL has a very broad range of customers, from industry and government and from within NIST, and the panel found that the laboratory serves all of these groups with distinction. In addition to the panel’s expert opinion, many quantitative measures confirm the relevance and effectiveness of ITL’s programs. One is the level of interaction between laboratory staff and their customers, which continues to rise. Attendance is up at ITL-led and -sponsored seminars, workshops, and meetings; staff participation in standards organizations and consortia is strong; and laboratory staff have robust relationships with researchers and users from companies, governmental agencies, and universities.

2. In February 2002, NIST management announced that the computing services functions currently housed in ITL will be moved into a separate unit, headed by a chief information officer (CIO) who will report directly to the NIST director. This transition is discussed in the “Program Relevance and Effectiveness” subsection, below.


FIGURE 8.1 Organizational structure of the Information Technology Laboratory. Listed under each division are the division’s groups.

Another visible measure of the quality and relevance of ITL’s work is the number of awards that laboratory staff receive from NIST, the Department of Commerce, and external sources. Examples include a Department of Commerce Gold Medal and an RSA Public Policy Award for the work on the Advanced Encryption Standard, an R&D 100 Award for the development of the Braille Reader, a series of awards from the National Committee for Information Technology Standards for leadership in standards-related activities such as the work on standards for geographic information systems, and the election of a staff member as a fellow of the American Society for Quality because of his work on applying statistics to measurement sciences. These honors, spread across the various divisions, recognize outstanding technical and program achievement at numerous levels.

ITL’s interactions with and impact on industrial customers continue to improve each year, and the panel applauds the laboratory’s ability to produce and disseminate results of value to a broad audience. ITL primarily serves two kinds of industrial customers: computer companies (i.e., makers of hardware and software) and the users of their products (which include companies from all sectors, government, and, to some extent, the public). The divisional reviews later in this chapter contain many examples of how ITL makes a difference. Notable cases include the Advanced Networking Technologies Division’s success at raising the visibility of co-interference problems between IEEE 802.11 and Bluetooth wireless networks and NIST’s technical contributions to evaluating possible solutions; the Convergent Information Systems Division’s development of an application that can preview how compressed video appears on different displays, thus allowing producers to make decisions about the amount of compression in light of the equipment likely to be used by the target audience; and the Software Diagnostics and Conformance Testing Division’s facilitation of the development of an open standard and needed conformance tests for extensible markup language (XML). In addition to serving all of these customers, ITL projects also have an impact worldwide. For example, standards developed with NIST’s help and leadership in the security, multimedia, and biometrics areas are all used throughout the relevant international technical communities.

In last year’s assessment report,3 the panel expressed concerns about industry trends in standards development that would affect ITL’s ability to effectively and openly help industry adopt the most appropriate standards for emerging technologies. The growing use of consortia and other private groups in standards development processes places a burden on ITL, which has to strike a balance between its obligation to support and encourage open processes and its need to be involved as early as possible in standards-setting activities so as to maximize the impact of ITL’s experience and tools. In some cases, a delicate trade-off must be made between participating in a timely way in organizations that will set standards for the industry and avoiding endorsement of standards set by exclusive groups. ITL’s role as a neutral third party and its reputation as an unbiased provider of technical data and tools have produced significant impact in many areas and should not be squandered by association with organizations that unreasonably restrict membership. The panel continues to urge ITL to establish a policy to help divisions decide when participation in closed consortia is appropriate and to consider how NIST can encourage industry to utilize open, or at least inclusionary, approaches to standards development.

Given that consortia, in some form or another, are here to stay and that in some cases it will be vital for NIST to participate in these consortia, the panel supports the efforts recently made by ITL and NIST to work on the internal legal roadblocks to participation, but it suggests that this work could be supplemented by efforts to educate external groups, such as consortia members and lawyers, on ways to facilitate NIST’s timely participation and technical input. This is a customer outreach effort as well as a legal issue.

One customer that relies significantly on ITL’s products and expertise is the federal government, which often uses NIST standards and evaluation tools to guide its purchase and use of information technology (IT) products, particularly in the computer security area. An example is the Computer Security Division’s Cryptographic Module Validation Program (CMVP), which has enabled purchasers, including the U.S. government, to be sure that the security attributes of the products that they buy are as advertised and appropriate. In the Information Access Division, the new Common Industry Format (CIF) standard provides a foundation for exchanging information on the relative usability of products and is already being used for procurement decisions by several large enterprises. Another key ITL activity relevant to the federal government is the work on fingerprint and face recognition. NIST standards and data have played a key role in the development of automated fingerprint identification systems. Also, since the attacks of September 11, 2001, there has been significant pressure to increase the reliability of biometric recognition technologies, especially face recognition. ITL’s existing, long-term programs and expertise in face, fingerprint, and gait biometrics will provide test data that will help drive system development and help the government evaluate system capabilities.

3. National Research Council, An Assessment of the National Institute of Standards and Technology Measurement and Standards Laboratories: Fiscal Year 2001, National Academy Press, Washington, D.C., 2001.
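As a generic illustration of how such test data supports evaluation, match scores from genuine (same-person) and impostor (different-person) comparisons can be reduced to error rates at a chosen decision threshold. The sketch below is illustrative only; the scores and threshold are invented and do not represent any NIST evaluation protocol.

```python
import numpy as np

# Illustrative only: compute false non-match and false match rates at a
# decision threshold from made-up similarity scores. Real evaluations sweep
# the threshold to trace out a full error trade-off curve.
def error_rates(genuine_scores, impostor_scores, threshold):
    genuine = np.asarray(genuine_scores)
    impostor = np.asarray(impostor_scores)
    false_non_match_rate = np.mean(genuine < threshold)   # genuine pairs rejected
    false_match_rate = np.mean(impostor >= threshold)     # impostor pairs accepted
    return false_non_match_rate, false_match_rate

genuine = [0.91, 0.84, 0.77, 0.95, 0.62]    # hypothetical same-person scores
impostor = [0.15, 0.42, 0.08, 0.71, 0.30]   # hypothetical different-person scores
print(error_rates(genuine, impostor, threshold=0.7))  # -> (0.2, 0.2)
```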

Programs such as the work on biometrics, especially face recognition, highlight a question relevant to many information technology activities: that is, in what context will technological advances be used? Information technology is often an enabling technology that will produce new capabilities with expected and unexpected benefits and costs.4 The panel acknowledges that ITL’s primary focus is on technical questions and technical quality, but it emphasizes that for the laboratory’s work to be responsible and for the results to be taken seriously in the relevant communities, recognition of the context in which new technologies will be applied is very important. This context has two elements: the deployment of the technology and the social implications of the technology. In the first area, the deployment questions relate to the functionality of the systems in which new technical capabilities will be used. A testbed is not necessarily meant to determine the “best” technology but rather the one that works well enough to meet the needs for which it is being developed. Often, the process of considering the possible applications of a technology results in a broader appreciation of the potential benefits. For example, appropriate security is actually an enabler that allows e-business, the globalization of work, collaboration across geography, and so on.

Understanding the ultimate goals for new technologies relates to the social implications questions. For example, security has serious implications for privacy. The panel emphasizes that in many of the ongoing programs—such as the work on the potential use of face recognition technologies as security systems in public places—ITL staff made long and arduous efforts to comply with existing privacy legislation. However, when describing the NIST results to public groups (such as the panel), staff should also be sure to take the time to acknowledge the privacy questions and describe potential future issues, as well as discussing the capabilities and benefits of the technological advancements.

Following are two examples of areas in which the panel believes that the potential societal issues or the actual context in which technologies would be used were not being fully considered. The first example is the suggestion that a commercial application for face recognition could be that of having an automated teller machine (ATM) recognize a user with Hispanic features and automatically switch to using Spanish. As many people of Hispanic (or Swedish or Asian) appearance are not in fact speakers of the “native” language implied by their looks, this is a naïve (and perhaps inappropriate) example of the technology’s potential. The second example is in the area of pervasive computing, where NIST’s work on “smart” meeting facilities was demonstrated for the panel. Recording meetings for search and archiving can offer significant benefits in some contexts, but it can also inhibit certain types of discussions. For example, the effectiveness of brainstorming sessions or examinations of “what if” scenarios might be significantly limited if the participants thought the discussion might later be taken out of context and broadcast.

4. How the social context can provide a framework for information technology development is discussed at length in the following report: Computer Science and Telecommunications Board, National Research Council, Making IT Better: Expanding Information Technology Research to Meet Society’s Needs, National Academy Press, Washington, D.C., 2000.

In addition to strong relationships with customers in industry and in the federal government, ITL places significant emphasis on effectively serving its customers within NIST. The panel commends the focus in both the Mathematical and Computational Sciences Division and the Statistical Engineering Division on building robust collaborations with scientists and engineers throughout ITL and the other NIST laboratories. One example is the work of the Mathematical and Computational Sciences Division on the mathematical modeling of solidification with staff from the Materials Science and Engineering Laboratory; another is the Statistical Engineering Division’s development of a method to combine data from diverse building materials studies for the Building and Fire Research Laboratory.
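For illustration only, one common statistical approach to combining estimates from diverse studies is inverse-variance weighting, in which each study's estimate is weighted by the reciprocal of its variance. The sketch below shows the general idea; it is not necessarily the method the Statistical Engineering Division developed, and the numbers are hypothetical.

```python
import numpy as np

# Inverse-variance weighted combination of study estimates (generic sketch).
def combine(estimates, std_errors):
    """Return the weighted mean and its standard error."""
    estimates = np.asarray(estimates, dtype=float)
    weights = 1.0 / np.asarray(std_errors, dtype=float) ** 2
    pooled = np.sum(weights * estimates) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))
    return pooled, pooled_se

# Three hypothetical studies measuring the same material property.
print(combine([10.2, 9.8, 10.5], [0.4, 0.6, 0.3]))
```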

A primary current responsibility of ITL is that of IT support for all of NIST. The relevant activities—which include the support and maintenance of campus networking, personal computers (PCs), administrative applications (such as accounting software), and telephones—are performed by the Information Services and Computing Division. These service programs were unified in this division in December 2000, and the panel is very pleased with the significant progress observed in the past 2 years. The quality and effectiveness of the support functions have improved, and so have the overall planning and strategic approach to providing the relevant services. A “NIST IT Architecture” has been developed, and it should help provide context and scope for each of the subarchitectures and various support functions at NIST. Other recent accomplishments include the formation and centralization of a NIST-wide help desk and increased standardization around core processes such as PC procurement. Issues remain, however, including the inability of this division’s staff to enforce or even check compliance with centralized IT standards and policies. For example, many units at NIST do their own systems administration, which could result in uneven implementation of appropriate security applications.

The key issue for IT services at NIST in the next year will be an organizational transition. In February 2002, NIST management announced that the support functions currently housed in ITL will be moved out of the laboratory into a separate unit, headed by a chief information officer (CIO) who will report directly to the NIST director. Since a significant problem for the current unit is the difficulty in getting the NIST laboratories to embrace consistent, institutionwide standards for IT systems, raising the services unit to a level equivalent with the laboratories may provide needed visibility for the issue. Another factor that may help is the emphasis by the current director of this new unit (the acting CIO) on demonstrating to the other NIST laboratories how IT services can facilitate their research and how standardizing basic applications can save time and money. Achieving acceptance of this new unit and centralized IT support across NIST will be a serious leadership challenge, as this approach will be a cultural shift for NIST. The panel encourages benchmarking with organizations such as Agilent Technologies that have successfully made such a transition.

Making the IT services component of NIST a separate unit rather than a division of ITL may bring it closer to other laboratories; however, it is important that this unit maintain close ties with ITL programs. For example, some of the work being done in the Computer Security Division can and should be applied to the security of the NIST system. Work on technologies for meetings can be tested and effectively used throughout NIST. Applying the development work of ITL’s research divisions across NIST will require that the services unit continue to track relevant ongoing projects and that ITL recognize the potential of using NIST as a whole as a testbed.

ITL has done a remarkable job of becoming more customer-oriented over the past several years. The panel applauds the laboratory’s efforts in outreach and notes that the progress reflects improvement in a whole range of areas—for example, gathering wider and more useful input to help with project selection, increasing dissemination, and planning for how customers will utilize NIST results and products. ITL has supported this increased focus on its customers by measuring outputs and outcomes that provide data on how the laboratory is doing in this area. (One example is that of tracking the number of times ITL-developed standards and technology are adopted by government and industry.)

Now that ITL is serving its customers so well, the panel wants to suggest that some attention could also be paid to strengthening the laboratory’s reputation and stature with its colleagues in relevant research communities. Customers are uniquely positioned to assess the timeliness of and need for ITL results, but ITL’s peers can and should assess the technical excellence of the laboratory’s work. A variety of reasons support having input from both groups, that is, having a balanced scorecard for the laboratory’s portfolio. One reason is that sometimes customer satisfaction is not the right metric, since NIST can, and in some cases should, hold companies to higher standards than the companies might wish. Another reason is that elevating the stature of ITL researchers in their peer communities can raise NIST’s credibility with its customers. Therefore, in the future the panel hopes to see increased emphasis on ITL’s visibility within relevant research communities.

Increased visibility, like the improvement in ITL’s customer relationships, can be driven by the use of appropriate metrics. It is not entirely clear what outputs or events will effectively measure ITL’s work in this area. Possibilities include but are not limited to the number of times that staff are named as nationally recognized fellows of professional organizations (such as IEEE, the Association for Computing Machinery [ACM], the American Physical Society, and the American Society for Quality), the number of times ITL staff are featured speakers at high-profile conferences, and the number of staff publications in top-tier peer-reviewed IT journals. The metrics will obviously depend on the field in which ITL’s research is occurring. The panel acknowledges that it is often inappropriate to compare NIST researchers directly with people working in industry research units or at universities, because ITL’s role of producing test methods, test data, standards, and so on is different from industrial or academic activities and is often unique. However, ITL’s peers at these other institutions are still in a position to recognize and evaluate the technical merit and quality of the NIST programs. The panel is not suggesting that recognition by external peer communities should replace responsiveness to customer needs as a primary focus, but it is instead suggesting that ITL perform the difficult balancing act of putting more emphasis on publication and interaction in the relevant research community without losing its focus on its customers.

Laboratory Resources

Funding sources for the Information Technology Laboratory are shown in Table 8.1. As of January 2002, staffing for the laboratory included 389 full-time permanent positions, of which 319 were for technical professionals. There were also 105 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

The panel’s primary concern in the area of human resources is the April 2002 retirement of the current director of ITL. The panel has observed and laboratory staff have explicitly stated that morale is at an all-time high in ITL, due in large part to the director’s leadership style and direction. A great deal of concern has surfaced among the staff over the process for filling the director’s slot, how long it will take, and what the caliber and style of the next director will be. The panel recommends that NIST leadership focus on providing clear communication to staff about the selection criteria and frequent updates as to the progress of the search and hiring process. Sharing relevant information will certainly help the transition proceed more smoothly.

TABLE 8.1 Sources of Funding for the Information Technology Laboratory (in millions of dollars), FY 1999 to FY 2002

Source of Funding                        FY 1999     FY 2000     FY 2001     FY 2002
                                         (actual)    (actual)    (actual)    (estimated)
NIST-STRS, excluding Competence            31.6        31.9        44.4        38.8
Competence                                  1.5         1.6         1.1         1.3
STRS—Supercomputing                        12.1        12.0        11.9        10.0
ATP                                         1.8         2.4         2.3         2.0
Measurement Services (SRM production)       0.0         0.0         0.1         0.5
OA/NFG/CRADA                                8.4         9.9        12.2        14.6
Other Reimbursable                          0.5         1.6         1.0         0.3
Agency Overhead                            14.4        16.4        18.4        28.2
Total                                      70.3        75.8        91.4        95.7
Full-time permanent staff (total)a          381         381         368a        389

NOTE: Funding for the NIST Measurement and Standards Laboratories comes from a variety of sources. The laboratories receive appropriations from Congress, known as Scientific and Technical Research and Services (STRS) funding. Competence funding also comes from NIST’s congressional appropriations but is allocated by the NIST director’s office in multiyear grants for projects that advance NIST’s capabilities in new and emerging areas of measurement science. Advanced Technology Program (ATP) funding reflects support from NIST’s ATP for work done at the NIST laboratories in collaboration with or in support of ATP projects. Funding to support production of Standard Reference Materials (SRMs) is tied to the use of such products and is classified as “Measurement Services.” NIST laboratories also receive funding through grants or contracts from other government agencies (OA), from nonfederal government (NFG) agencies, and from industry in the form of cooperative research and development agreements (CRADAs). All other laboratory funding, including that for Calibration Services, is grouped under “Other Reimbursable.”

aThe number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March (due to a reorganization of ITL that year).

One facilities issue highlighted in last year’s assessment report was the location of five divisions in NIST North. The existence and use of NIST North is a perennial issue. The panel recognizes that the quality of the space in NIST North is significantly better than what would be available on campus; however, access to these improved facilities does not compensate for the distance from the rest of the campus for two of the ITL divisions—the Mathematical and Computational Sciences and the Statistical Engineering Divisions. The distance inhibits informal interactions of the staff of these two divisions with their collaborators in the other laboratories on the main campus. Thus, ITL management has submitted the space requirements of these divisions to NIST management, which will be making revised space allocation decisions related to the new Advanced Measurement Laboratory (AML), due to be completed in 2004. The panel encourages NIST management to make a serious effort to move these two divisions back to the main campus.5 However, 2004 is still several years away. In the meantime, the panel continues to note that a mix of systems taking into account technological and social factors could help compensate for the distance. Tools such as videoconferencing, Web collaboration packages, and Web broadcasting can support nonphysical interactions, but regular, scheduled (and subsidized) opportunities for face-to-face meetings are necessary to make these technical solutions most effective. These approaches are applicable to the NIST North/main campus gap, as well as to the Gaithersburg/Boulder divide.

5. One group in the Mathematical and Computational Sciences Division, the Scientific Applications and Visualization Group, is already located on the main campus.

A second facilities issue raised in the 2001 assessment report was the poor network connectivity of NIST to the outside world. The panel was very pleased to learn that since the last review, NIST has joined the Internet 2 project, thus dramatically improving the connectivity and placing NIST on a par with the major universities and industrial research organizations that participate in this project. The next step will be educating researchers in the other laboratories at NIST about how to take full advantage of this new capability.

The panel met with staff in “skip-level” meetings (sessions in which management personnel were not present). The key message from these meetings was that in the past few years, under the current management of the laboratory, ITL has become an especially enjoyable place to work, noted for such attributes as respect for the individual, stability, an appropriate level of flexibility, and focus on visible results. The panel also observed this high level of morale in visits to individual divisions. Turnover in ITL was approximately 9 percent this year, down slightly from last year. Although turnover has decreased in industry in the past year and is now about comparable to that in ITL, over the last several years ITL has had a remarkably low comparative turnover rate for an IT organization. The panel applauds laboratory and division management for creating such a positive work environment.

Some issues were brought up in the skip-level meetings. The panel cannot judge whether these concerns are broad-based or isolated but notes that laboratory management should be aware of them. For example, ITL staff said that while relationships with the other NIST laboratories had improved, they still felt that ITL did not have the same status or prestige that other laboratories enjoy. The panel notes that continued interaction with staff in other laboratories, internal and external recognition of staff, and cross-laboratory projects will help ameliorate imbalances or perceptions of “second-class” status. The shift of IT support services to a separate unit also might help emphasize to the rest of NIST that the core mission of ITL is the same as that of the rest of the laboratories. Other concerns expressed by staff included perceived inconsistencies in performance measurement and some related frustrations about apparently unequal burdens of work owing to the difficult process for firing poor performers in the federal system. Such perceptions, if they indeed exist on a broader scale in ITL, would not be unique to ITL, NIST, government agencies, or even businesses in general.

Laboratory Responsiveness

The panel found that, in general, ITL has been very responsive to its prior recommendations and observations. The panel’s comments appear to be taken very seriously, and the suggestions made in the assessment reports are often acted on, especially as related to the redirection and conclusion of projects. When advice is not taken, ITL usually provides a good rationale for why a given action has not occurred. Examples of positive responses to suggestions made in last year’s report include the improved strategic planning observed in the Mathematical and Computational Sciences Division, the redirection of the work on distributed detection in sensor networks in the Advanced Networking Technologies Division, the transfer of the latent fingerprint workstation to a Federal Bureau of Investigation (FBI) contractor in the Information Access Division, and the work on connecting NIST to Internet 2 in the Information Services and Computing Division. More discussion of responsiveness and of areas needing continued attention is presented in the divisional reviews below.


In some areas, the issues raised by the panel are long-term questions or areas in which changes are not entirely within ITL’s power. In these cases, the panel looks to see whether serious effort has been made. Usually the panel observes some progress and plans to follow up on the issues in future assessments. The location of the Mathematical and Computational Sciences and Statistical Engineering Divisions in NIST North is one such issue, and while the panel is glad to learn that their relocation in conjunction with the occupation of the AML is being considered, the panel will be watching to see whether this occurs and how ITL handles the time prior to AML’s completion. Another such issue is the growing use by industry of consortia and other private groups to set industry standards. The panel recognizes that this trend cannot be controlled by ITL, but it would like to see further consideration of internal policies on use of closed consortia and of ways to encourage open standards development.

MAJOR OBSERVATIONS

The panel presents the following major observations:

  • The panel is impressed with the progress that has occurred in strategic planning in the Information Technology Laboratory (ITL), particularly in the emergence and acceptance of a framework under which laboratory activities operate. The framework includes an ITL Research Blueprint and ITL Program/Project Selection Process and Criteria.

  • ITL has done a remarkable job of becoming more customer-oriented over the past several years. The panel applauds the laboratory’s efforts in outreach and notes that the progress reflects improvement in a whole range of areas, from gathering wider and more useful input to help with project selection, to increased dissemination and planning for how customers will utilize NIST results and products.

  • The strong customer relationships now need to be balanced by robust visibility and recognition in ITL’s external peer communities. Publications in top-tier journals, presentations at high-profile conferences, and awards from ITL’s peers will help confirm the technical merit of the work done at NIST and will add to the laboratory’s credibility with its customers.

  • Conveying awareness of the social issues related to ITL’s technical work in areas such as biometrics is an important element of the credible presentation of ITL results to diverse audiences. In certain areas, considering the technical and social context of how the work will be used may help focus the research on the most appropriate questions.

  • The shift of the information technology (IT) support functions to a new unit reporting directly to the NIST director is an opportunity and a challenge for NIST leadership. If this new unit can convince the NIST laboratories to embrace consistent, institutionwide standards for IT systems, it will be an important step and a major cultural shift at NIST. Appropriate emphasis is being placed on demonstrating how IT services can facilitate research and how standardizing basic applications can save time and money.

  • The retirement of the current director of ITL is clearly a source of concern within the laboratory. The panel recommends that NIST leadership focus on communicating clearly with staff about the selection criteria for the director’s replacement and that it supply staff with frequent updates on the progress of the search and hiring process. Sharing of relevant information will certainly help the transition proceed more smoothly.


DIVISIONAL REVIEWS

Mathematical and Computational Sciences Division

Technical Merit

The mission of the Mathematical and Computational Sciences Division is to provide technical leadership within NIST in modern analytical and computational methods for solving scientific problems of interest to U.S. industry. To accomplish this mission, the division seeks to ensure that sound mathematical and computational methods are applied to NIST problems, and it also seeks to improve the environment for computational science at large. Overall, the panel is very impressed with the quality of the division’s work and the strength of its collaborations with other divisions and laboratories at NIST. The division is on track in its execution of a large, ambitious project—the Digital Library of Mathematical Functions (DLMF)—and it is becoming deeply involved in a strategic NIST-wide project on quantum computing. The panel also observes that the division’s strategic planning process is strong and that it is improving.

The Mathematical and Computational Sciences Division is organized in four groups: Mathematical Modeling, Mathematical Software, Optimization and Computational Geometry, and Scientific Applications and Visualization. The division’s projects have common themes, such as better mathematical models, better solvers, application of parallelism, and the development of reference implementations and data sets. The projects are mostly collaborative, with collaborators chosen from other NIST laboratories and from external organizations. The division’s overall portfolio is a balanced mixture of short-term and long-term projects and of projects with small and large numbers of staff. Last year’s assessment report raised some questions about the division’s project selection process and strategic planning, and this year the panel was impressed to see that significant progress had been made in this area. The division has a number of ongoing planning activities at division, laboratory, and NIST-wide levels, and it now has a good mixture of bottom-up and strategically generated projects. The triennial update of the division’s strategic plan is scheduled for later in 2002, and the panel looks forward to reviewing the revised plan in next year’s assessment.

The work on the Digital Library of Mathematical Functions is a good example of the focus on reference materials that makes the division’s products so useful to a broad array of customers. The goal of this ambitious project is to provide a Web replacement for the classic National Bureau of Standards publication Handbook of Mathematical Functions by Abramowitz and Stegun.6 The DLMF will be extremely important to scientists and engineers who need access to the latest tools and algorithms related to special mathematical functions, and this work is precisely in line with the division and NIST missions. The division has formulated a good strategy for developing the DLMF; it is using outside editors, attracting external funding (the division was awarded a competitive grant from the National Science Foundation for this project), attracting internal NIST funding, and developing a single writing style to be used in all chapters. Each chapter includes mathematical properties, methods of computation, and graphs, among other useful information. In the past year, progress on DLMF has continued at a good pace; drafts are now available for much of the work, and validation of the information is taking place. The current schedule calls for formal public release of the completed library in 2003, at which time the panel expects the DLMF Web site to become one of the most popular mathematical Web sites in the world. In the upcoming year, the critical challenges for this project are not primarily technical, but relate more to program management, as the impending deadlines require a significant amount of editorial and production work. The panel hopes that NIST will allow staffing levels to remain sufficient to ensure a high-quality final product.

6. M. Abramowitz and I.A. Stegun, eds., Handbook of Mathematical Functions, with Formulas, Graphs, and Tables, Applied Mathematics Series 55, National Bureau of Standards, Washington, D.C., 1964.
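As an illustration of the kind of entry such a reference collects (this particular identity is quoted here as a standard, well-known example, not from the DLMF itself), a chapter on the gamma function would record properties such as the reflection formula

$$\Gamma(z)\,\Gamma(1-z) = \frac{\pi}{\sin(\pi z)}, \qquad z \notin \mathbb{Z},$$

together with methods for computing the function and graphs of its behavior.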

An example of the division’s effective work in mathematical modeling is the work on Object Oriented Micromagnetic Framework (OOMMF). The goal of this project is to provide a platform for two- and three-dimensional modeling of magnetic phenomena associated with magnetic storage media. (The two-dimensional code is complete, but work is still under way on the three-dimensional version.) The code is written to be highly configurable; it uses Tcl/Tk to provide easy scripting capabilities and cross-platform graphical user interfaces, and the solvers are open-source C++. In 2001, 11 papers were published using results generated by OOMMF.
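To give a concrete sense of the kind of computation such a solver performs, the minimal sketch below integrates the Landau-Lifshitz-Gilbert equation for a single macrospin in a constant field. Micromagnetic codes of this type evolve the same equation over a full two- or three-dimensional mesh, with exchange, demagnetization, and anisotropy contributions to the effective field; the parameter values here are illustrative, and this is not OOMMF code.

```python
import numpy as np

# Damped precession of a unit magnetization vector m in a constant effective
# field h, integrated with explicit Euler steps (illustrative sketch only).
GAMMA = 2.211e5   # gyromagnetic ratio, m/(A*s)
ALPHA = 0.5       # dimensionless Gilbert damping (large, for fast relaxation)

def llg_rhs(m, h):
    """Landau-Lifshitz-Gilbert right-hand side for a unit vector m."""
    pre = -GAMMA / (1.0 + ALPHA ** 2)
    mxh = np.cross(m, h)
    return pre * (mxh + ALPHA * np.cross(m, mxh))

m = np.array([1.0, 0.0, 0.0])       # initial magnetization along +x
h = np.array([0.0, 0.0, 8.0e4])     # constant field along +z, A/m
dt = 1.0e-13                        # time step, s

for _ in range(20000):
    m = m + dt * llg_rhs(m, h)
    m /= np.linalg.norm(m)          # keep |m| = 1 after each step

print(m)  # m has relaxed toward the field direction, approximately [0, 0, 1]
```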

An impressive new project is the work on quantum information processing. This “hot” area is attracting a great deal of attention from the research community, and the panel believes that NIST is well positioned to have an impact here, owing to NIST’s Nobel Prize-winning physicists, who have expertise relevant to quantum computing, and to the strength of the Mathematical and Computational Sciences Division both in continuous modeling and simulation and in discrete algorithms.

The work on mathematical modeling of solidification is an excellent example of how the applied mathematicians in the Mathematical and Computational Sciences Division can effectively leverage their expertise to produce significant impact from their consulting and collaborative roles. In this project, one division staff member has teamed up with about six dozen people from the NIST Materials Science and Engineering Laboratory (MSEL) and a large group of university colleagues to model electrodeposition in support of experiments in this area, to model interfacial instabilities during the cooperative growth of monotectic materials, and to develop models of solid-state order-disorder transitions. The staff member from the Mathematical and Computational Sciences Division was elected a fellow of the American Physical Society for his work in the area. Another example of a successful collaborative project is the work on machining process metrology, in which a researcher from this division is working with 10 people from three other NIST laboratories (MSEL, the Manufacturing Engineering Laboratory, and the Physics Laboratory).

The Scientific Applications and Visualization Group joined the Mathematical and Computational Sciences Division last year, relocating from another division within ITL. This group has a strong positive impact through its collaborations with NIST scientists who require state-of-the-art algorithms and architectures to get the performance they need from their scientific codes. As the sheer volume of data in scientific computations increases, visualization techniques become increasingly essential in order for results to be effectively utilized and interpreted. This group has an excellent combination of a wide range of technical skills and a strong collaborative style, as demonstrated particularly by the cluster of projects around concrete modeling. The group also has a strong role in creating standards; for example, a few years ago it facilitated the development of an Interoperable Message Passing Interface (IMPI) standard, and this standard is now having an impact on commercial software.
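For context, IMPI operates below the application programming interface: it specifies wire protocols so that processes launched under different vendors' MPI implementations can interoperate, while application code written against the standard MPI interface is unchanged. The minimal sketch below uses the mpi4py Python bindings purely for illustration; the choice of bindings is an assumption and is not part of the NIST work.

```python
from mpi4py import MPI  # standard MPI interface via Python bindings (illustrative)

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each process contributes its rank; the sum is collected on rank 0.
total = comm.reduce(rank, op=MPI.SUM, root=0)
if rank == 0:
    print("sum of ranks:", total)
```

A program like this would typically be launched with a command along the lines of mpiexec -n 4 python script.py; under IMPI, the participating processes could in principle be hosted by different MPI implementations.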

Program Relevance and Effectiveness

The Mathematical and Computational Sciences Division staff is well connected, well published, and influential on organizing committees and editorial boards. Their work is well regarded by customers both inside and outside NIST. All of the division’s projects have an excellent record of dissemination of results, through a variety of mechanisms such as software, publications, Web services and documentation, conference talks, and workshops. A key factor in the success and impact of the division is the high quality of the staff. Overall, the technical excellence of the division’s staff is demonstrated in a variety of ways. Personnel receive numerous internal and external awards (including two elections to professional society fellowships in 2001); the division continues to produce a significant number of refereed publications and invited talks; staff serve as editors for many journals (including ACM Transactions on Mathematical Software, for which the division chief is editor-in-chief); and division personnel fill many senior leadership positions in professional societies and working groups and on conference organizing committees.

Last year, the panel cited the Java Numerics project as an excellent example of work with impact and vision and as a good use of NIST’s scientific leadership role. Clearly NIST agrees, as the leaders of the project were awarded the NIST Bronze Medal in 2001. In 2002, this project continues to produce important results. Another impressive project that supports NIST’s core mission is the work on Sparse Basic Linear Algebra Subprograms (BLAS). As with Java Numerics, this work on mathematical software standards is able to impact the commercial landscape owing to the high quality and reputation of the division’s scientists.
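To illustrate the kind of kernel such a standard covers, the sketch below computes a sparse matrix-vector product for a matrix stored in compressed sparse row (CSR) form. It is a generic example only; it does not reproduce the Sparse BLAS interface or the NIST reference implementation.

```python
import numpy as np

# y = A @ x for a sparse matrix A in CSR form (generic illustrative sketch).
def csr_matvec(indptr, indices, data, x):
    y = np.zeros(len(indptr) - 1)
    for row in range(len(y)):
        start, end = indptr[row], indptr[row + 1]
        y[row] = np.dot(data[start:end], x[indices[start:end]])
    return y

# A = [[4, 0, 1],
#      [0, 3, 0],
#      [2, 0, 5]]
indptr  = np.array([0, 2, 3, 5])
indices = np.array([0, 2, 1, 0, 2])
data    = np.array([4.0, 1.0, 3.0, 2.0, 5.0])
x       = np.array([1.0, 2.0, 3.0])

print(csr_matvec(indptr, indices, data, x))  # -> [ 7.  6. 17.]
```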

Last year’s assessment report discussed the importance—as an element of maintaining the reputation of NIST scientists—of supplementing the many articles coauthored by division staff and appearing in the journals of their collaborators’ fields with publication in their own disciplinary journals. The division’s FY 2001 annual report7 lists the publications that appeared or were accepted in refereed journals this past year; about 12 are in journals in mathematics, scientific computation, or visualization, another 20 or so are in journals in other disciplines, and the last handful are harder to categorize. This represents a balance between the division’s mission of consulting and collaboration and the need for its staff to be recognized as leaders in the mathematical and computational research communities. The former requirement should not be allowed to overshadow the latter one, and the panel encourages the division to maintain and in some cases to raise its visibility at premier mathematical and computational conferences in the areas of division expertise. For example, several projects in the division have a significant component involving computational geometry. The division therefore could consider raising its profile at the annual ACM Symposium on Computational Geometry. Familiarity with the journals and activities in these fields is an important factor in identifying opportunities, and the panel was pleased to learn that the availability of online journals is easing somewhat the problems noted last year with maintaining the NIST library’s journal collections.

Since the terrorist attacks in September 2001, a variety of programs at NIST have been affected by the federal government’s changing priorities. In the Mathematical and Computational Sciences Division, some research projects will probably achieve a higher profile because of their relevance for homeland security. Examples include the projects on image processing and on laser radar (LADAR) systems, both of which will help protect the safety of first responders to terrorist events or other disasters. The division and ITL are participating in NIST-wide planning for activities connected to homeland security and counterterrorism. Indirect effects, such as increased budget uncertainty at various government agencies with which the division works, will also have an impact on the division.

7. U.S. Department of Commerce, Technology Administration, National Institute of Standards and Technology, Mathematical and Computational Sciences Division Summary of Activities for Fiscal Year 2001, National Institute of Standards and Technology, Gaithersburg, Md., January 2002.


TABLE 8.2 Sources of Funding for the Mathematical and Computational Sciences Division (in millions of dollars), FY 1999 to FY 2002

Source of Funding                        FY 1999     FY 2000     FY 2001     FY 2002
                                         (actual)    (actual)    (actual)a   (estimated)
NIST-STRS, excluding Competence             3.3         3.6         4.1         7.2
Competence                                  0.2         0.2         0.1         0.1
STRS—supercomputing                         0.7         0.6         3.4         0.6
ATP                                         0.1         0.1         0.5         0.5
OA/NFG/CRADA                                0.4         0.7         0.9         1.9
Other Reimbursable                          0.0         0.1         0.0         0.0
Total                                       4.7         5.3         9.0        10.3
Full-time permanent staff (total)b           30          27          39b         39

NOTE: Sources of funding are as described in the note accompanying Table 8.1.

aThe difference between the FY 2000 and FY 2001 funding and staff levels reflects the reorganization of ITL, in which the Scientific Applications and Visualization Group was moved out of the Convergent Information Systems Division and into this division.

bThe number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March.

Division Resources

Funding sources for the Mathematical and Computational Sciences Division are shown in Table 8.2. As of January 2002, staffing for the division included 39 full-time permanent positions, of which 36 were for technical professionals. There were also 15 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

Resources in the division are tight, and the panel is concerned about several effects, related mainly to the constraints on the number of permanent staff, which has been basically constant for several years (the rise in Table 8.2 is due to an organizational change in which a new group was added). In this environment, the division has not been able to hire permanent staff to address continuing and emerging needs in computational science at NIST. One such area is quantum computing, where the division cannot recruit new permanent staff and is instead making good use of outside consultants and collaborators and a new postdoctoral research associate in this area. Traditional areas that are also short-staffed include numerical analysis, mathematical software, and optimization. The panel is particularly concerned about the situation in the Mathematical Software Group, where staffing constraints have limited the group’s ability to explore new projects and forced several existing projects into “maintenance mode.” While the ongoing work in the group is all meritorious, the panel believes that the group is being stretched dangerously thin.

One factor affecting the division’s ability to hire new staff is the virtual lack of attrition recently among permanent staff. This lack of turnover reflects the high morale observed in the division. The panel found that individual researchers are enthusiastic about their work and their management. Staff praised laboratory management’s fostering of good communications both horizontally and vertically; the panel notes that this is quite a change from 3 years ago, when morale problems were severe. Indeed, many positive comments were made about the entire laboratory and division management chain during the panel’s meetings with individual staff members. Taking the rarity of new hires to permanent staff as a given, management has made effective use of affiliated faculty at area universities and of postdoctoral research associates. Unfortunately, space in this division is tight, limiting the possible number of summer visitors.

Given the many demands on personnel’s time in the current environment, the division must make careful decisions about when and how to develop new revisions of software packages. In many current software projects, the developers are working on a new release for an already successful package. The panel believes that the strategic goals of the division should be taken into account when setting priorities for which packages receive new releases. In particular, the panel encourages the division to develop a set of criteria for deciding when work on a second release should begin and what the goals of that release should be. Such criteria should emphasize how the second release would promote the overall mission and goals of the division. In some cases, the decision may be that the most strategic use of resources would be to cease further development on one package and move to a new project.

Another area in which the division needs to be careful about effective deployment of staff time is the long-term maintenance of Web resources such as the Guide to Available Mathematical Software (GAMS), Matrix Market, and the Template Numerical Toolkit (TNT). The division has a huge Web presence (which the panel applauds), and this presence is bound to increase with the release of the DLMF and other projects. Many traditional NIST activities produce standard reference data or materials, which are then distributed to customers by NIST’s Office of Measurement Services. This static model is not a good fit for the Mathematical and Computational Sciences Division’s standard information resources, but the division would certainly benefit from a new approach that allowed the research groups to be relieved of some of the more mechanical tasks required to maintain these resources on the Web. A solution to this problem is not obvious, because proper, long-term maintenance of resources such as the DLMF requires the attention of a mathematician (not just a Webmaster), yet this mathematician’s time might be better spent on new projects. Indeed, finding a balance between providing useful and up-to-date technical information on the Web and having time to develop new research activities is an issue for most scientific and research organizations that post technical Web pages, and NIST can provide leadership in this area. As a first step, the panel suggests that the division think about the factors that might go into deciding at what level to carry out long-term maintenance of Web pages and that it develop a policy governing these decisions.

As has been noted in many past reports, the housing of most of the division at NIST North makes informal interaction between division staff and personnel on the main campus difficult. This is a significant disadvantage for the division, as collaborative efforts with other NIST laboratories are a primary focus. Once a collaboration has begun, the physical distance is only an inconvenience, but many new collaborations, especially those in new areas, originate from casual contacts that are not currently available to the staff at NIST North. Increasing the difficulty of discovering new areas for cooperative work may have negative long-term impacts on the vitality of the division’s project mix. A related concern is that the division’s new group, Scientific Applications and Visualization, is currently located on the main campus, not at NIST North, and the panel did not observe much interaction between this group and the rest of the division, probably because of the physical separation. Another locational issue is the relationship between the division’s Gaithersburg personnel (which include division management) and its small group at NIST Boulder. Some of the Boulder staff have expressed a desire for closer connections with the rest of the division. Division management should consider mechanisms to increase Boulder-Gaithersburg interactions, such as a small dedicated fund for staff travel between the two sites. It would also be useful to have some Boulder staff travel to Gaithersburg for the annual assessment of the division.

Advanced Networking Technologies Division

Technical Merit

The mission of the Advanced Networking Technologies Division is to provide the networking industry with the best in test and measurement technology. This mission statement is appropriate, and it accurately reflects the NIST and laboratory missions within the context of technologies relevant to this division’s work. The division focuses on using test and measurement technologies to improve the quality of networking specifications and standards and to improve the quality of networking products based on public specifications. In emerging technology areas, the division also performs modeling and simulation work to help ensure that specifications produced by industry and standardization organizations are complete, unambiguous, and precise. The panel finds that the division’s activities over the past year are definitely relevant and effective and that the programs encompass several of the currently important areas in networking research.

The work of the Advanced Networking Technologies Division is of consistently high quality, and the panel is pleased to see that the incremental improvements observed over the past 3 years are continuing. The organization of ongoing programs around coherent research themes has produced good synergy and allowed more communication and collaboration among the research groups. The themes also provide continuity as projects are completed and new activities initiated.

The Advanced Networking Technologies Division consists of three groups: High Speed Network Technologies, Wireless Communication Technologies, and Internetworking Technologies. Currently, the division’s work is organized in six projects: Networking for Pervasive Computing, Wireless Ad Hoc Networks, Agile Switching, Internet Telephony, Internet Infrastructure Protection, and Quantum Information Networks. These projects are generally well focused on achieving specific and valuable goals and are well directed in support of the NIST mission. The panel is particularly pleased by the balance among the projects: half are driven purely by needs of relevant external communities, and half are projects coordinated across ITL and other NIST laboratories. A good mix of time scales also exists, as two projects are aimed at having short-term impacts, three at intermediate-term effects, and one at long-term goals. Below, the panel describes some of the highlights and issues observed in its assessment of the division’s activities.

A great many activities are under way in the Networking for Pervasive Computing area. Two of these activities are aimed at supporting the development of networking standards for relevant devices. The first focuses on issues surrounding how to craft the various ubiquitous wireless standards (e.g., IEEE 802.15 and IEEE 802.11) so that they do not conflict within the unlicensed 2.4-GHz band. The original designers of the relevant standards all assumed that simply by complying with Federal Communications Commission (FCC) regulations for operation in this band, their technology would not conflict with the operation of other radio technologies sharing the spectrum. However, it has now become clear, largely through work done in the Advanced Networking Technologies Division, that successful coexistence will almost certainly require more than compliance with FCC regulations, and NIST has taken an important leadership role on questions related to reconciling the standards. Division staff have extended their earlier work on formal modeling of Bluetooth and simulation of the interactions between it and IEEE 802.11, and they are now developing tools to assess the effectiveness of various methods of coexistence, including synchronized receivers and combined radios. These results are valuable, and the panel is particularly impressed with how aggressively and effectively the division tackled the problem. The timely information coming out of NIST will allow the IEEE (Institute of Electrical and Electronics Engineers) groups to incorporate the division’s solutions into the standards. The second effort in the area of networking standards for pervasive computing devices focuses on the analysis of the resource discovery protocols being developed for ubiquitous computing systems. Current efforts include work on modeling service descriptions for Jini and Universal Plug and Play (UPnP). The panel continues to be impressed with this activity and suggests extending the work to the Internet Engineering Task Force (IETF) Service Location Protocol.
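To illustrate the kind of coexistence question such modeling addresses (though not the division’s actual tools), the following toy Monte Carlo sketch estimates how often a frequency-hopping transmitter such as Bluetooth lands on spectrum occupied by a fixed-channel IEEE 802.11 network in the 2.4-GHz band. The channel center, overlap width, and duty cycle are simplifying assumptions, not NIST parameters.

```python
import random

# Toy model: Bluetooth hops pseudo-randomly over 79 1-MHz channels; an 802.11b
# network occupies a fixed block of roughly 22 MHz. A "collision" here simply
# means a hop lands inside that block while the WLAN happens to be transmitting.
BT_CHANNELS = 79          # 2402-2480 MHz, 1-MHz spacing
WLAN_CENTER = 39          # assumed 802.11 channel center index (illustrative)
WLAN_HALF_WIDTH = 11      # ~22 MHz of occupied bandwidth
WLAN_DUTY_CYCLE = 0.4     # assumed fraction of time the WLAN is on the air

def collision_probability(trials: int = 100_000) -> float:
    """Estimate the fraction of Bluetooth hops that overlap WLAN activity."""
    collisions = 0
    for _ in range(trials):
        hop = random.randrange(BT_CHANNELS)
        in_band = abs(hop - WLAN_CENTER) <= WLAN_HALF_WIDTH
        wlan_active = random.random() < WLAN_DUTY_CYCLE
        if in_band and wlan_active:
            collisions += 1
    return collisions / trials

if __name__ == "__main__":
    print(f"Estimated hop/WLAN overlap probability: {collision_probability():.3f}")
```

Even this crude sketch shows why FCC compliance alone does not guarantee coexistence: a meaningful fraction of hops overlap the fixed-channel network whenever both systems are active.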

The Wireless Ad Hoc Networks project was formed this year by combining the division’s work on technologies and standards for mobile ad hoc networks (MANETs) and for smart sensor networks. The work on MANETs encompasses both analysis and simulation. While developing the simulations, division staff evaluated the effectiveness of two popular simulation environments, OPNET and NS. Thus, one valuable outcome of the project is a forthcoming report comparing the usefulness of these two environments for simulating MANETs; this information should help drive the future evolution of these simulation tools. The division’s research on MANET routing criteria focuses on Kinetic Spanning Trees and clustering structures; this new activity has considerable promise and is well aligned with current work in this field outside NIST. In the smart sensors area, the work on distributed detection in sensor networks has been redirected into investigating networking protocols and distributed algorithms for support of sensor networks, as suggested by the panel in last year’s assessment.
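As a rough illustration of the kind of ad hoc network scenario such simulations explore (not the division’s OPNET or NS models), the sketch below scatters nodes at random, links any pair within radio range, and checks whether the resulting topology is connected. The node count, area, and radio range are arbitrary assumptions.

```python
import math
import random
from collections import deque

def random_topology(n_nodes: int = 30, area: float = 1000.0, radio_range: float = 250.0):
    """Place nodes uniformly in a square area and link pairs within radio range."""
    nodes = [(random.uniform(0, area), random.uniform(0, area)) for _ in range(n_nodes)]
    adj = {i: set() for i in range(n_nodes)}
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            if math.dist(nodes[i], nodes[j]) <= radio_range:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def is_connected(adj) -> bool:
    """Breadth-first search from node 0 to test whether every node is reachable."""
    seen, queue = {0}, deque([0])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == len(adj)

if __name__ == "__main__":
    connected = sum(is_connected(random_topology()) for _ in range(200))
    print(f"{connected}/200 random topologies were fully connected")
```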

In the Agile Switching project, the division has completed its work on modeling, evaluation, and research of lightwave networks (MERLiN). This year the focus is on extending that work to multilayer restoration and recovery spanning multiprotocol label switching (MPLS) and optical networks. As part of this work, division staff developed a modeling tool called GHOST, which promises to have wide utility in research and in industry for analyzing approaches to restoration and recovery. In addition to developing this tool, staff are also using GHOST to investigate various aspects of restoration, particularly those involving large-scale failures and multiple simultaneous failures. A new facet of this project is the work on extending GHOST to allow investigation of the interaction between failure/recovery and quality-of-service-based traffic engineering; this effort promises to produce valuable results by next year’s assessment. The panel notes that integrating and aligning the new project on game-theoretic approaches to analyzing failure and restoration with the work on extending GHOST would benefit both projects.
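The flavor of such restoration analysis can be conveyed with a small sketch (again illustrative, and not GHOST itself): build a ring topology with a couple of chord links, fail a link, and check whether traffic between two endpoints can still be routed and at what hop cost. The topology and failure choice below are assumptions made for the example.

```python
from collections import deque

def shortest_path_len(adj, src, dst, failed=frozenset()):
    """Hop count of the shortest path avoiding failed links, or None if unreachable."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            return dist[u]
        for v in adj[u]:
            if frozenset((u, v)) in failed or v in dist:
                continue
            dist[v] = dist[u] + 1
            queue.append(v)
    return None

# Illustrative 8-node ring with two chord links providing restoration capacity.
ring = {i: {(i - 1) % 8, (i + 1) % 8} for i in range(8)}
for a, b in [(0, 4), (2, 6)]:
    ring[a].add(b)
    ring[b].add(a)

if __name__ == "__main__":
    working = shortest_path_len(ring, 0, 3)
    # Fail the 0-4 chord and see whether (and at what cost) traffic is restored.
    restored = shortest_path_len(ring, 0, 3, failed={frozenset((0, 4))})
    print(f"working path: {working} hops; after 0-4 failure: {restored} hops")
```

Real tools, of course, add traffic matrices, capacity constraints, and timed recovery behavior on top of this kind of reachability core.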

The past year has seen a great deal of progress in the project on Internet Telephony (voice over Internet Protocol [IP]). The staff spent the first year of this project (2000) learning the technology and building an interesting set of diagnostic and testing tools for session initiation protocol (SIP)-based call signaling. The development of such an interoperability test tool is valuable to the community, and the Web-enabled SIP load generation and trace capture elements of this tool are already demonstrating their utility by helping implementers tease out subtle interoperability problems. Now that the basic pieces of this project have achieved critical mass, the panel suggests that it would be advantageous to expand the effort to include the associated protocol machinery that surrounds basic call signaling (such as telephony routing over IP, telephone number mapping, and call routing). Addressing questions related to the other elements around SIP-based call signaling would significantly enhance NIST’s contributions in this area.
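A flavor of SIP call-signaling testing can be given with a minimal sketch that assembles a syntactically plausible INVITE request and emits it at a configurable rate. This is purely illustrative, is unrelated to the division’s actual tool, and uses placeholder addresses and header values.

```python
import itertools
import time

def build_invite(call_id: int, target: str = "sip:bob@example.com",
                 source: str = "sip:loadgen@example.com") -> str:
    """Assemble a minimal (illustrative) SIP INVITE request as text."""
    return "\r\n".join([
        f"INVITE {target} SIP/2.0",
        f"Via: SIP/2.0/UDP loadgen.example.com;branch=z9hG4bK-{call_id}",
        f"From: <{source}>;tag=gen{call_id}",
        f"To: <{target}>",
        f"Call-ID: {call_id}@loadgen.example.com",
        "CSeq: 1 INVITE",
        "Max-Forwards: 70",
        "Content-Length: 0",
        "", "",
    ])

def generate_load(calls_per_second: float = 5.0, total_calls: int = 10):
    """Yield INVITE messages at a fixed rate (printing rather than sending them)."""
    interval = 1.0 / calls_per_second
    for call_id in itertools.islice(itertools.count(1), total_calls):
        yield build_invite(call_id)
        time.sleep(interval)

if __name__ == "__main__":
    for message in generate_load(calls_per_second=2.0, total_calls=3):
        print(message.splitlines()[0], "...")  # show only the request line
```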

In early 2001, the Advanced Networking Technologies Division completed its valuable work on developing reference implementations for Internet Protocol Security (IPsec). Now the emphasis has shifted to Internet infrastructure protection, in particular to the protection of the Domain Name System (DNS) via DNSsec. This important project is a collaborative effort with the Computer Security Division, and the panel believes that the focus on DNSsec is appropriate. While DNSsec has not enjoyed wide adoption to date, recent events have raised the general awareness about the need to protect shared Internet services such as the DNS, and hence the impact of NIST’s work in this area should be higher in the future. The division has also begun a new project on evaluating the performance and scalability of IPsec key management protocols. This timely work should help inform the IETF’s ongoing effort to select a successor to the Internet Key Exchange (IKE) key management protocol. The panel also urges the division to be alert to potential new issues in high-performance IPsec extensions that are expected to arise in the next year or two.
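To hint at what measuring the performance of key management involves, the sketch below times repeated ephemeral Diffie-Hellman exchanges (using X25519 from the widely used third-party Python cryptography package) as a stand-in for the public-key work inside an IKE-style handshake. A real protocol evaluation measures full message exchanges, retransmission behavior, and per-session state, not just this one step, so this is only a toy proxy.

```python
import time
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

def time_exchanges(n: int = 500) -> float:
    """Return the mean time (ms) for one ephemeral Diffie-Hellman exchange (both sides)."""
    start = time.perf_counter()
    for _ in range(n):
        a = X25519PrivateKey.generate()          # initiator ephemeral key
        b = X25519PrivateKey.generate()          # responder ephemeral key
        shared_a = a.exchange(b.public_key())    # initiator's shared secret
        shared_b = b.exchange(a.public_key())    # responder's shared secret
        assert shared_a == shared_b              # sanity check
    return (time.perf_counter() - start) / n * 1000.0

if __name__ == "__main__":
    print(f"~{time_exchanges():.3f} ms per ephemeral DH exchange (toy proxy measurement)")
```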

As part of the NIST-wide initiative in quantum computing, the Advanced Networking Technologies Division is working with the Computer Security Division on protocols and prototypes for quantum cryptography. The Advanced Networking Technologies Division’s contribution, known as the Quantum Information Networks project, is in the area of key management protocols for quantum key distribution. The panel is pleased to see that this project has both a protocol design and a prototyping element using a real quantum channel. This project is associated with the IPsec key management protocols work mentioned above, and the panel expects that good synergy should be achievable across the two areas. While the practical impact of the Quantum Information Networks work is too far in the future to predict, having a few such long-term projects provides a good balance to the division’s overall research program. In addition, the division is able to contribute to a NIST-wide program, thus keeping researchers and management engaged in NIST’s overall mission.
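For readers unfamiliar with quantum key distribution, the sketch below simulates only the classical “sifting” step of a BB84-style protocol, in which sender and receiver keep just the bits for which their randomly chosen measurement bases happen to agree. It is a didactic toy with no quantum channel, no eavesdropper model, and no relation to the division’s prototype.

```python
import random

def bb84_sift(n_bits: int = 64):
    """Simulate BB84 sifting: keep bits where sender and receiver bases match."""
    alice_bits  = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.choice("+x") for _ in range(n_bits)]   # rectilinear or diagonal
    bob_bases   = [random.choice("+x") for _ in range(n_bits)]
    # With matching bases (and no noise) Bob recovers Alice's bit; with mismatched
    # bases his result is effectively random and the position is discarded.
    bob_results = [bit if a == b else random.randint(0, 1)
                   for bit, a, b in zip(alice_bits, alice_bases, bob_bases)]
    return [(bit, res) for bit, res, a, b
            in zip(alice_bits, bob_results, alice_bases, bob_bases) if a == b]

if __name__ == "__main__":
    key_pairs = bb84_sift()
    agreement = all(a == b for a, b in key_pairs)
    print(f"sifted key length: {len(key_pairs)} of 64 raw bits; agreement: {agreement}")
```

The key management protocols studied in this project sit on top of such a sifted key, handling authentication, error reconciliation, and key confirmation between the two endpoints.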

In summary, the panel is very pleased with the division’s ability to sunset activities either because the stated goals have been accomplished or because technical innovations require a shift in focus. Programs concluded this past year include the development of reference implementations for IPsec, the MERLiN project, the broadband wireless work on IEEE 802.16, the 3G cellular work, and the active network project, which has been shifted to the technology transfer stage. The division has also demonstrated impressive agility and the ability to jump into an area early and to select work with significant potential impact. Between the 2001 and 2002 assessments, the division used project mergers and conclusions to move from nine projects to six (one of which is entirely new). This consolidation helps highlight synergies between activities and helps reduce the number of projects with just a few staff members working on them. The panel is also impressed with the closer collaborations that have developed between staff working on different projects; specific examples of effective cooperative pairings include that of optical restoration modeling and MPLS and that of sensor networks and MANETs. Utilization of the expertise and results available in other groups within the division is a constructive way to leverage a project’s resources and maximize NIST’s impact.

Program Relevance and Effectiveness

Staff of the Advanced Networking Technologies Division continue to be active in a variety of industry organizations, including the IETF, the IEEE, and the International Telecommunication Union. NIST personnel are well respected by the staff of these standards organizations and by the communities they serve. The value of the division’s standards-related efforts is realized in several ways. Most often, technical work done at NIST, such as modeling and analysis or development of testing tools and evaluation criteria, provides a greater understanding of the implications of proposed standards or supplies solutions to problems that could arise in standards development. NIST’s familiarity with the networking community and its reputation for an unbiased technical approach are also useful in determining what issues have inspired the standards effort and in defining the technical space on which the standards bodies should be focusing. One example of recent impact is the division’s notable success at raising the visibility of co-interference problems between IEEE 802.11 and Bluetooth wireless networks. Current efforts with the potential for significant future impact include the work on SIP interoperability testing, which is likely to help tighten the specifications and arbitrate interoperability disputes, and the work on understanding the dynamics of resource discovery protocols, which may help improve existing solutions such as UPnP and Jini, while potentially showing the benefits of more general and standards-based approaches such as the Service Location Protocol.

In the 2001 assessment, the panel discussed at length the growing practice of industry to develop standards in consortia or other private groupings rather than through the traditional “open” approach of mainly utilizing professional organizations. The situation has not changed since last year, and the issue remains important. The panel and the division recognize that the “closed” system is somewhat antithetical to the NIST and governmental philosophy of supporting all U.S. companies and the public in an open manner. However, to carry out the NIST mission of strengthening the U.S. economy, the division must be able to influence the standards that will be used in the networking community no matter how they are developed. Therefore, NIST should develop a policy on this issue, together with criteria for deciding when and how to participate in these consortia. Some of the closed standards groups are actually very inclusive, with minimal burdens placed on participants; others may be designed to exclude potential competitors and should not be endorsed by NIST. NIST should also consider whether it could develop a strategy for encouraging the IT community to continue to utilize open, or at least quasi-open, models of standards development.

The Advanced Networking Technologies Division assumes a leadership role in the networking communities, in part by virtue of the standards activities described above. However, it is important for the staff to build awareness of NIST’s expertise and to maintain its reputation in other ways. The division does publish in journals and conference proceedings, and its personnel attend a variety of meetings. These activities are highly appropriate, but the panel suggests that a larger presence in the more prestigious publications and conferences of the networking field might be appropriate. Stronger and more visible participation in the top tier in these areas would provide the widest dissemination and enable the greatest impact for NIST results. It would also allow the division to burnish its reputation, develop the reputations and visibility of its most respected staff, and position itself as a key element of the networking community.

Division Resources

Funding sources for the Advanced Networking Technologies Division are shown in Table 8.3. As of January 2002, staffing for the division included 24 full-time permanent positions, of which 20 were for technical professionals. There were also 10 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

The primary issue for the Advanced Networking Technologies Division is its limited number of full-time permanent staff. The division performs relevant and effective work, in part because of a large cadre of guest researchers (20 people as of February 2002). This heavy reliance on visitors means that the division depends on temporary employees to support the mission-critical projects, and the potential exists for unexpected delays or the premature termination of an important effort when a guest researcher leaves NIST. The panel believes that these risks are currently outweighed by the benefits provided by the added manpower and the relationships built with other institutions, but the division should continue to be careful about maintaining an appropriate balance between permanent and temporary staff. Similarly, caution should be exercised about the balance between internal and external funds.

The panel was pleased to observe that morale within the division is quite good and that the staff is enthusiastic about its work. This year’s transition in leadership (a new division chief) was accomplished very smoothly, with no disruption in focus or loss of momentum. Division and laboratory management should fill the position of High Speed Network Technologies Group leader soon to allow the acting group leader to focus his attention on his new responsibilities as division chief.

TABLE 8.3 Sources of Funding for the Advanced Networking Technologies Division (in millions of dollars), FY 1999 to FY 2002

Source of Funding                      FY 1999     FY 2000     FY 2001     FY 2002
                                       (actual)    (actual)    (actual)    (estimated)
NIST-STRS, excluding Competence           4.9         4.0         4.1         4.6
Competence                                0.2         0.3         0.2         0.2
ATP                                       0.3         0.5         0.3         0.2
OA/NFG/CRADA                              1.2         1.7         1.5         2.2
Total                                     6.6         6.5         6.1         7.2
Full-time permanent staff (total)a         30          27         21a          24

NOTE: Sources of funding are as described in the note accompanying Table 8.1.

aThe number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March.

Computer Security Division

Technical Merit

The mission of the Computer Security Division is to improve information systems security by:

  • Raising awareness of information technology risks, vulnerabilities, and protection requirements, particularly for new and emerging technologies;

  • Researching, studying, and advising agencies of IT vulnerabilities and devising techniques for the cost-effective security and privacy of sensitive federal systems;

  • Developing standards, metrics, tests, and validation programs to promote, measure, and validate security in systems and services, to educate consumers, and to establish minimum security requirements for federal systems; and

  • Developing guidance to increase secure IT planning, implementation, management, and operation.

The division’s programs directly support this mission and are consistent with the mission of the Information Technology Laboratory and of NIST. Privacy and security are essential to protecting electronic commerce, critical infrastructure, personal privacy, and private and public assets, so this work makes important contributions to strengthening the U.S. economy and promoting the public welfare.

The programs under way in the Computer Security Division are highly appropriate, and the division’s work has great technical merit. After a reorganization, the division is now composed of four groups: Security Technology, Systems and Network Security, Security Management and Guidance, and Security Testing and Metrics (the last two groups are new).

The Advanced Encryption Standard (AES) continues to be the focus of the Security Technology Group. In August 2001, NIST hosted a second workshop to continue to facilitate the analysis and development of new modes of operation for AES.8 NIST staff are also developing test and validation suites for the applications of AES. The open design of the AES, and the competition used to select it, have greatly enhanced the reputation of the division and are a model for future standards work.
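For readers unfamiliar with what a mode of operation adds to a block cipher (see footnote 8), the following sketch applies AES in counter (CTR) mode using the third-party Python cryptography package. It is illustrative only and is not drawn from NIST’s test or validation suites.

```python
import os
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def aes_ctr_roundtrip(plaintext: bytes) -> bytes:
    """Encrypt and then decrypt under AES-256 in CTR mode, returning the recovered text."""
    key = os.urandom(32)      # 256-bit key
    nonce = os.urandom(16)    # initial counter block; must never repeat under one key
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce), backend=default_backend()).encryptor()
    ciphertext = encryptor.update(plaintext) + encryptor.finalize()
    decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce), backend=default_backend()).decryptor()
    return decryptor.update(ciphertext) + decryptor.finalize()

if __name__ == "__main__":
    message = b"block ciphers need a mode to handle arbitrary-length data"
    assert aes_ctr_roundtrip(message) == message
    print("CTR-mode round trip succeeded")
```

The mode, not the block cipher itself, is what turns a fixed-block primitive into a usable confidentiality (or authentication) service, which is why test and validation suites must cover the modes as well as the core algorithm.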

The Systems and Network Security Group is working in a broad range of areas, including emerging technologies, reference data and implementations, and security guidance. One project is aimed at providing the technical support necessary to create a ubiquitous smart card infrastructure in the United States. Specific NIST efforts include the development of automated test suites and a testbed for the Government Smart Card Program, as well as the development of architectural models and security testing criteria. This work is appropriate because it will enable the development of consistent test methodologies for smart cards and will also reduce their cost and encourage their use in many areas. Another ongoing activity in this group is the ICAT Metabase, a searchable index of computer vulnerabilities that links users to a variety of publicly available vulnerability databases and patch sites. By integrating ICAT with other standard lexicons, such as the Common Vulnerabilities and Exposures (CVE) list, division staff have made this resource invaluable to industry as well as to researchers and users of computer systems.
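The idea behind a searchable vulnerability metabase keyed to a standard naming scheme can be sketched in a few lines. The entries and identifiers below are invented placeholders, and this is in no way the ICAT implementation.

```python
from dataclasses import dataclass, field

@dataclass
class VulnerabilityEntry:
    """A minimal record keyed by a standard identifier (placeholder data only)."""
    cve_id: str
    summary: str
    references: list = field(default_factory=list)   # links to external databases/patch sites

class Metabase:
    def __init__(self):
        self._entries = {}

    def add(self, entry: VulnerabilityEntry) -> None:
        self._entries[entry.cve_id] = entry

    def search(self, keyword: str):
        """Return entries whose summary mentions the keyword (case-insensitive)."""
        kw = keyword.lower()
        return [e for e in self._entries.values() if kw in e.summary.lower()]

if __name__ == "__main__":
    db = Metabase()
    db.add(VulnerabilityEntry("CVE-0000-0001", "Example buffer overflow in a mail daemon",
                              ["https://vendor.example/patch-1"]))
    db.add(VulnerabilityEntry("CVE-0000-0002", "Example weak default password in a router",
                              ["https://vendor.example/advisory-2"]))
    for hit in db.search("overflow"):
        print(hit.cve_id, "-", hit.summary)
```

The value of the shared lexicon is exactly this cross-referencing: once every tool names a vulnerability the same way, indexes, advisories, and patch sites can be linked mechanically.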

Another activity in the Systems and Network Security Group is the work related to security on mobile devices, such as personal digital assistants (PDAs). The goal of this work is to develop new security mechanisms for wireless mobile devices so that the devices can be used as smart cards or computation devices to validate information about the possessor. While this objective is clearly appropriate for this division, the panel is concerned about the direction of the project and is not convinced that the work will prove fruitful. Securing hand-held PDAs is important, indeed critical, when the owner of the PDA is trying to protect information on the PDA. However, if the owner is untrusted and a third party places information on the PDA to be used later as a security token (for whatever purpose), that untrusted owner has access to the information and can read, alter, or delete it. More specifically, securing the information would require that the PDA be a reference monitor, which it is not. The division should recast the work with this observation in mind.

A relatively new effort in the Systems and Network Security Group is a project on defending public telephone networks, including analysis of Signaling System 7 (SS7) vulnerabilities. The scope of this project is not clear to the panel. It might address basic communication security, application-specific security requirements for SS7, or both. The former area can be addressed via underlying communication protocol security mechanisms, such as the use of IPsec when SS7 is carried over IP. Concerns in the latter area are intrinsic to SS7 and have been studied by telephone companies in the past. For example, in the early 1990s, GTE Labs developed an SS7 “firewall” to protect central office switches. Before embarking on this project, division staff should become familiar with previous work in this area, not all of which has necessarily been widely published. Thus, the panel recommends that division staff contact the research groups that have worked in this area, such as the Verizon Technology Organization (which houses the former GTE Labs), Telcordia (formerly Bellcore), and the National Communication System technical staff, in order to become familiar with previous work in the area of SS7 security and countermeasures.

Work in the new Security Management and Guidance Group is entirely appropriate, and the group’s goal of advising and assisting government agencies is laudable. The panel felt that, on the whole, the programs under way to implement these goals are very suitable for the division and have high technical merit. A primary focus is the Computer Security Resource Center, a valuable Web site that provides information about computer security for the public.9 This Web site is accessed by a wide range of organizations, including federal agencies, businesses, and schools, and exemplifies how the division can effectively make information available and accessible to a broad audience. The maintenance of this site is consistent with the division’s mission, as is the work on outreach to both federal agencies and businesses. These activities serve both to educate the public about computer security and to provide resources that the public can use.

8  

A mode of operation, or mode, for short, is an algorithm that features the use of a symmetric key block cipher algorithm to provide an information service, such as confidentiality or authentication.

One of the key programs in the new Security Testing and Metrics Group is the National Information Assurance Partnership (NIAP) program, which focuses on developing Common Criteria protection profiles and investigating issues related to the use of these profiles in developing security requirements for the federal government. The goal of this work is to enable quicker, more effective security evaluations and to standardize baseline security requirements for particular environments and products. In the past year, six Common Criteria testing laboratories have been accredited, and work is proceeding on the development of new protection profiles (such as for financial institutions). Fourteen nations have now signed mutual recognition testing agreements (recognizing the Common Criteria and the Common Criteria testing laboratories).

Another cornerstone of the Security Testing and Metrics Group is the work on cryptographic security testing and cryptographic module validation. A variety of efforts contribute to these projects, including work on public-key infrastructure (PKI) standards, components, and committees. One goal of this work is to enable businesses, citizens, and organizations to interact with the government over the Internet. Originally, the plan was to develop a single portal for this “e-government” program, but, unfortunately, the lack of funding in 2002 for integrating authentication into this project seriously weakened its viability.

The NIAP and the Cryptographic Module Validation Program (CMVP) are important components of the Security Testing and Metrics Group. While both programs are consistent with the group’s overall objective of improving the security and quality of IT products through the development and use of metrics and tests, the panel feels very strongly that under no circumstances should the NIAP and the CMVP be merged. The CMVP has a quantitative focus and aims to provide automatic measures of compliance, while the Common Criteria (the basis for evaluations under the NIAP) is qualitative and is not susceptible to the quantification that the CMVP uses. The customers of the two validation processes, and the goals of those processes, differ significantly, and merging them would be detrimental to both.

Program Relevance and Effectiveness

The Computer Security Division’s activities are relevant to a broad audience, including hardware and software makers and users in industry, the federal government, academic and industrial researchers, and the public. The division develops standards and guidelines for cryptography and security implementations, produces tools and metrics for testing compliance and performance of security systems and products, and facilitates the development of new and more effective security techniques. Division staff disseminate their results through publications, presentations, advice to government agencies, participation on committees, and posting of tools, databases, and information on the Web. The result is enhanced IT security through wider availability of products that meet security standards.

9  

The Computer Security Resource Center is available online at <http://csrc.nist.gov/>.

One example of the impact of this division’s programs is the way that the CMVP improves the security and quality of cryptographic products. Of 164 cryptographic modules tested, about 50 percent had security flaws, and over 95 percent had documentation errors. Of 332 algorithm implementations submitted for validation, about 25 percent had security flaws and over 65 percent had documentation errors. Detecting these problems enables vendors and implementers to correct their products before the modules and algorithms are put into production and bought and used by consumers. This program is a sterling example of what the division is and should be doing to carry out its mission.

In addition to developing cryptography and security standards and tools, division staff are active in several national and international standards activities and in groups such as ANSI, ISO, and the IETF. The committees and activities of these organizations are examples of open standards development and adoption environments. In last year’s report, the panel discussed the issues related to the growing number of cases of industry’s developing and choosing standards in closed or exclusionary groups, such as some consortia. The panel emphasized that the division must take care not to endorse specific protocols developed by companies outside open standards environments. Instead, the division should aim to sanction only standards that resulted from open development processes, such as processes within inclusive standards organizations or at NIST itself. When flaws in a standard are found, the standard must be fixed, but the panel was concerned that exclusive standards processes make it easier for companies to argue against fixing the standard when the change would delay the deployment of a new product or interfere with products already on the market. NIST cannot be party to such behavior.

The panel continues to worry about the potential for such problems to arise with standards developed in closed settings, and it continues to recommend that only standards arrived at openly be endorsed by the Computer Security Division. This year, the panel’s concerns have broadened slightly, particularly in light of the deference occasionally shown by NIST to the vendor communities. The issue arose in the context of the division’s work on best practices, as well as in the division’s development of protection profiles for the Common Criteria. NIST can and should use industry standards and methodologies, but only those that have been carefully developed, thoroughly studied, and extensively tested, and it should verify that they are not merely adequate but are the best available with respect to the performance goals they are designed to achieve. Thus, standards and methodologies from industry should not be adopted unless NIST verifies that these criteria are met.

Division Resources

Funding sources for the Computer Security Division are shown in Table 8.4. As of January 2002, staffing for the division included 48 full-time permanent positions, of which 42 were for technical professionals. There were also 17 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

The funding for the Computer Security Division is dropping dramatically in FY 2002, almost entirely because of the decision by Congress and the administration not to continue funding two large programs housed in this division: the Computer Security Expert Assist Team (CSEAT), which was funded at $3 million in FY 2001, and the Critical Infrastructure Protection (CIP) Research and Development Grants Program, which was funded at $5 million in FY 2001. In light of recent events, the panel is particularly appalled at the reduction in funding for computer security activities. This is a time of heightened concern about the potential vulnerabilities of the nation’s computers and networks, and the division has the technical expertise to make major contributions to the protection of computer and infrastructure systems. Congress and the administration should use and expand that expertise rather than ignore or reduce it. Legislation proposed in the spring of 2002 seems to ignore or bypass NIST, which is detrimental to morale and wasteful of the expertise available in the Computer Security Division.

TABLE 8.4 Sources of Funding for the Computer Security Division (in millions of dollars), FY 1999 to FY 2002

Source of Funding                      FY 1999     FY 2000     FY 2001     FY 2002
                                       (actual)    (actual)    (actual)    (estimated)
NIST-STRS, excluding Competence           5.9         6.6        17.6         9.7
ATP                                       0.2         0.3         0.4         0.0
OA/NFG/CRADA                              2.3         1.4         2.1         1.1
Other Reimbursable                        0.1         0.1         0.0         0.0
Total                                     8.5         8.4        20.1        10.8
Full-time permanent staff (total)a         48          43         40a          48

NOTE: Sources of funding are as described in the note accompanying Table 8.1.

aThe number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March.

One risk from the reduction of the congressionally allocated internal (STRS) funding is that the division may be forced to rely more on external (OA/NFG/CRADA) funding. This type of support is usually tied to particular projects that are of interest to the outside funding agency. When the division has less control over the scope and direction of its overall portfolio of activities, morale suffers, and the staff’s ability to support the missions of the division, ITL, and NIST is compromised. Thus, the division should be careful to accept external money only for projects that contribute to, complement, and enhance the division’s ability to meet its overall objectives. The panel is very concerned about this issue, and it recommends that the division avoid accepting external funding unless it is justified by reasons other than simply keeping people on staff. The panel does note that the division chief has shown exceptional initiative in investigating other potential sources of income for the division, such as charging customers for cryptographic module validation. While a workable fee structure has not yet been established, this is an appropriate avenue to explore; if the services provided by NIST were provided by a commercial testing company, the customers certainly would not get them for free.

In the discussion of Computer Security Division resources in the 2001 assessment report, the panel expressed concerns about two programs: CSEAT and the CIP Grants Program. While funding for NIST for both of these programs has been eliminated in FY 2002, before that time the division had been very responsive to the panel’s concerns, and the panel commends the division for its efforts. In the first area, the concern related to the focus of CSEAT. The panel recommended that CSEAT’s scope be retargeted to support training and education rather than maintaining its initial focus of developing a capability for penetration studies for other government agencies. The panel was pleased that this shift did occur, and that last year, CSEAT reviewed policies and provided guidance to federal agencies and high-risk computer security programs. Of course, without funding, these activities will cease. In the second area, the panel indicated its belief that the current level of funding for the CIP Grants Program (in FY 2001, $5 million was to be distributed in grants) was too small to have an impact. Numerous grants funded at low amounts can provide neither the breadth nor the depth of research required to make meaningful contributions to the massive and complex problem of how to secure the nation’s critical infrastructure. The division was responsive to this observation, and the panel applauds the creative efforts made to increase the pool of funding by finding other agencies to cosponsor the work (one agency agreed to do so). However, the lack of any FY 2002 funding from Congress will end this program entirely.

Information Access Division

Technical Merit

The mission of the Information Access Division is to accelerate the development of technologies that allow intuitive, efficient access, manipulation, and exchange of complex information by facilitating the creation of measurement methods and standards. Through collaboration with industry, academia, and government, the division contributes to the advancement of these technologies and enables their faster transition into the commercial marketplace and into the applications of division sponsors by coordinating and providing performance metrics, evaluation methodologies, test suites and test data, prototypes and testbeds, workshops, standards, and guidelines.

The Information Access Division is composed of four groups: Speech, Retrieval, Image, and Visualization and Usability. New activities and recent accomplishments in each of these groups are discussed below, together with any potential issues observed by the panel.

The Speech Group continues to work with the Defense Advanced Research Projects Agency (DARPA), the National Security Agency (NSA), and the spoken language research community in industry and at universities to develop metrics for evaluating state-of-the-art speech and speaker recognition systems and to coordinate benchmark tests within the community. The size and scope of the group’s efforts have grown in recent years, reflecting the increasing commercial interest in spoken language technologies. New activities in emerging areas include the recognition of conversational telephone speech and of the speaker, as well as the Effective Affordable Reusable Speech-to-text (EARS) project, which is sponsored by DARPA. The goal of NIST’s work on EARS is to develop and carry out performance evaluations of tools to produce transcriptions of speech that are substantially richer and more accurate than is currently possible. A key step will be defining the “rich transcription” concept, that is, specifying what features (beyond the text of spoken words) are desired. Another new effort in the Speech Group is providing a development and evaluation infrastructure for the pervasive computing program’s automatic meeting transcription project. This project produces a large corpus of audio and video recordings from meetings, and such material is difficult to evaluate using current recognition technologies. The Speech Group’s work on new evaluation protocols, metrics, and software will make a substantial contribution to the pervasive computing work.

The Retrieval Group is involved in several projects designed to encourage research in and systematic evaluation of information management systems. The best known of these activities is the internationally recognized Text Retrieval Conference (TREC) series. The TREC Program has evolved from its early focus on traditional text retrieval and routing applications to considering much richer information access problems that are of interest to both commercial and government users. Recent focus areas include spoken document retrieval, video retrieval, question answering, and cross-language retrieval, which all continue the TREC tradition of enabling and driving the development of new capabilities. In the effort in video retrieval, which is the newest project in the Retrieval Group, the division’s work is pushing the frontiers of multimedia retrieval technologies and developing resources that will provide the foundation for research for years to come. The availability of a testbed for component capabilities (shot detection) and end-to-end tasks (retrieval) will be instrumental in video retrieval research.

In question answering, the focus is on systems that can handle tasks such as returning lists of examples to a single question, interacting with the questioner in a dialog, and knowing when the system does not know the answer to the question. This work is designed to further extend the capabilities of existing systems, and it emphasizes the importance of providing information rather than documents to people. The interest in interactive systems for question answering provides an opportunity for the division’s Retrieval Group and its Visualization and Usability Group to collaborate on moving research and evaluation methods forward in the area of interactive systems; the panel looks forward to hearing about progress in this exciting new program.

In the cross-language retrieval area, TREC continues work on strategically important new efforts in Arabic cross-language retrieval. However, some of the other cross-language work has been transferred to other institutions. The European languages have migrated to the Cross-Language Evaluation Forum, and the Asian languages have gone to the NII-NACSIS Test Collection for IR Systems (both of these organizations are modeled much like TREC). The panel is pleased to see that this transition has occurred, as each organization is now taking the lead on the languages in which it has the most expertise. Collaborations between the institutions continue, and the most efficient use of resources is occurring.

In addition to the TREC work, the Retrieval Group is also involved in metrics-based evaluation for two other important and challenging areas of information management. The Document Understanding Conference (DUC) focuses on document summarization techniques, which make it easier for people to quickly digest the ever-growing volume of information that they encounter. It is difficult to evaluate summarizing tools, and NIST is contributing to theoretically sound and practically useful assessment methods. Another information management activity is the new Advanced Question and Answering for Intelligence (AQUAINT) Program, sponsored by the Advanced Research and Development Activity. NIST’s work in AQUAINT will be on metrics and evaluation tools to drive research and development on techniques for drawing on unstructured and structured data in multiple languages and modalities to find answers to complex questions. This program is not specifically aimed at homeland security applications, but these sorts of techniques could certainly increase analysts’ ability to find useful information quickly. The division’s role, developing testbeds and evaluating the performance of systems on these more complex tasks, is critical to the overall success of the program.

The DUC and AQUAINT Programs do not yet have the visibility of TREC (in part because DUC and AQUAINT are currently small activities that can only host workshops of a limited size). For these programs to succeed and grow, the division must maintain its focus on the development of test collections and metrics to evaluate new technologies.

The Image Group continues its work on fingerprint databases and testing, as well as on the HumanID project, which deals with multiple and whole-body biometrics used for identification at a distance. The fingerprint work is largely funded by the FBI, and the division’s state-of-the-art projects in this group support the FBI’s world-leading fingerprint operations. Some of the latest accomplishments include work on standards for palm prints and high-resolution fingerprints, on ways to add demographic information to fingerprint files, and on facilitating the compatibility of U.S. formats with the implementations used by the United Kingdom and Interpol. Last year, the panel noted that the latent fingerprint workstation project in this area had reached maturity, and the division has responded by phasing out this work at NIST and transferring the expertise to an FBI contractor, who is successfully using this product to train fingerprint examiners.

One of the predominant current needs of the law enforcement world is interoperability standards for the biometrics used by various government agencies, and the Image Group has a long history of contributing to the establishment of data formats for biometric interchange. The Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (PATRIOT) Act, passed in October 2001, requires NIST to help certify a technology standard to verify the identity of individuals entering the United States. The act specifically mentions fingerprint and face biometrics, in which areas the division has worked for many years, but the identity verification work may also involve human identification at a distance, a field that NIST co-pioneered just a few years ago. The Image Group’s role in responding to the PATRIOT Act is as a database provider and technology evaluator for counterterrorism tools, and the tasks related to this act place a significant burden on the group’s resources. Various activities are being planned, including work on certification procedures based on face verification tests previously done by this group for biometrics; the compilation of data sets, including NIST standard databases, DARPA HumanID databases, and other data supplied by government agencies; and the design of test systems to evaluate performance in various scenarios. These efforts are important, and results are required within 1 year. The group is well positioned to achieve the objectives because the necessary tools, expertise, testing procedures, and databases are in place from its previous work in these areas. As of February 2002, plans for the necessary projects had been formulated and work had begun.

Another ongoing activity in the Image Group is the work on MPEG-7 standards for video and multimedia. Currently only one staff member is assigned to this project; his responsibilities include chairing relevant ISO/IEC committees, developing large MPEG databases and software tools, and hosting the Web sites to disseminate these products. New MPEG standards are currently being determined by the community, and an increase in the number of staff on this project might be needed to ensure that the new standards are technically appropriate.

The Visualization and Usability Group is involved in several efforts aimed at improving the usability of IT software and Web sites. Such improvements in usability can have a significant impact, since poor usability contributes to the high cost of ownership of software, and poor Web site usability results in misinformation, lowered efficiency, and lost opportunities. The Industry Usability Reporting effort and the Common Industry Format (CIF) for reporting summative user test results provide the infrastructure for sharing usability information between consumers and producers of software. NIST was instrumental in bringing together industry leaders in several working groups and in driving the effort to fast-track the new CIF standard, which lays the foundation for factoring usability into software procurement decisions. The CIF test, evaluation, and report (CIFter) project will use the CIF standard to identify efficient and effective usability practices and to develop a benchmark against which new evaluation techniques can be compared. In related work, the Visualization and Usability Group is also involved in developing prototype software and coordinating working groups focused on remote and automated usability testing for Web sites.
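To give a sense of what a common reporting format buys consumers of usability data, here is a sketch of a summative test record carrying typical effectiveness, efficiency, and satisfaction fields. The field names and sample values are illustrative assumptions, not the actual CIF schema.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskResult:
    """One participant's result on one task (illustrative fields only)."""
    task: str
    completed: bool
    time_seconds: float
    errors: int
    satisfaction_1_to_7: int

def summarize(results):
    """Aggregate the effectiveness/efficiency/satisfaction measures a report might carry."""
    return {
        "completion_rate": mean(1.0 if r.completed else 0.0 for r in results),
        "mean_time_s": mean(r.time_seconds for r in results),
        "mean_errors": mean(r.errors for r in results),
        "mean_satisfaction": mean(r.satisfaction_1_to_7 for r in results),
    }

if __name__ == "__main__":
    sample = [
        TaskResult("find product page", True, 42.0, 0, 6),
        TaskResult("find product page", True, 58.5, 1, 5),
        TaskResult("find product page", False, 120.0, 3, 2),
    ]
    print(summarize(sample))
```

The point of a shared format is that two vendors’ reports aggregate the same measures the same way, so a procurement office can compare them directly.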

Staff from several Information Access Division groups contribute to the ITL-wide effort on pervasive computing. The Pervasive Computing Program began 3 years ago, and NIST has now reached the point at which it can contribute substantially to this nascent field. Two important milestones were achieved in the past year. The first is the completion of a room for automatic meeting transcription in which SmartFlow software collects data from different multimedia input devices, and the second is the development of a general-purpose application programming interface (API) to run these devices.

Over the past 3 years, the panel has offered a series of recommendations on the pervasive computing efforts, mainly concerned with the balance between the creation of novel components and the development of a testbed that could be used to drive research in this field. The panel is very pleased with ITL’s responsiveness to its observations, and it notes that the program has progressed to be exactly in conformance with the ITL and NIST missions. Now that ITL has completed the basic infrastructure and testing tools for pervasive computing research, it will be interesting to monitor the progress that the community makes in this field.

Program Relevance and Effectiveness

The ability to quickly and accurately analyze the ever-increasing amount of information that we all encounter is critical to individual and corporate productivity and to government effectiveness. In all of the Information Access Division’s groups, work is under way to provide developers and users of information management systems with the tools they need to measure and improve the performance of these systems.

In the Speech Group, NIST-administered benchmark tests have clearly contributed to the improvements in the capabilities of commercial automatic speech recognition products over the years. As the panel observed last year, the division’s tests provide a quantitative measure of performance (i.e., the accuracy) of speech recognition systems, and this measure enables technique developers to compare methods and efficiently make advances that will enhance product capabilities.
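The accuracy measure referred to here is typically reported as word error rate: the minimum number of substitutions, insertions, and deletions needed to turn the system’s transcript into the reference transcript, divided by the reference length. The short sketch below computes it by dynamic programming; it is a generic textbook formulation, not NIST’s scoring software.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Levenshtein distance over words, normalized by the reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1   # substitution cost
            dp[i][j] = min(dp[i - 1][j] + 1,       # deletion
                           dp[i][j - 1] + 1,       # insertion
                           dp[i - 1][j - 1] + cost)
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

if __name__ == "__main__":
    ref = "the quick brown fox jumps over the lazy dog"
    hyp = "the quick brown fox jumped over a lazy dog"
    print(f"WER = {word_error_rate(ref, hyp):.2%}")   # two word errors out of nine
```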

In the Retrieval Group, the TREC Program takes full advantage of NIST’s unique position as an impartial facilitator and evaluator of work to drive research and development of new information retrieval technologies. External involvement in TREC continues to grow; attendance was up 25 percent in 2001, and 35 percent of the participating groups were from industry. Commercial products often utilize ideas and systems first developed in the context of TREC. The program’s success is due in part to its continuing evolution; it continually concludes work on tracks where the impact of NIST’s benchmarks are diminishing and starts up new programs in emerging areas, such as cross-language retrieval, multimedia retrieval, and question answering.

The impact of TREC can be seen in various ways. The first is the interactions between participants that occur at the annual workshops, where university, industry, and government groups are all in attendance. One of TREC’s critical functions is to drive commercial development in areas where government has information management challenges, and past and current programs (such as spoken document retrieval, topic detection and tracking, and multilingual question answering) have demonstrated NIST’s success at this task. Another key outcome of TREC is the use of the databases and relevance assessments in a great deal of ongoing experimental work. For example, more than 50 percent of the papers at the most recent ACM Special Interest Group for Information Retrieval Conference used data from TREC to evaluate their systems.
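At its core, this style of evaluation compares a system’s ranked list of documents against pooled relevance judgments. The sketch below computes non-interpolated average precision for a single topic, a simplified version of the standard measure, using made-up document identifiers and judgments.

```python
def average_precision(ranked_docs, relevant):
    """Mean of the precision values at the rank of each retrieved relevant document."""
    hits, precisions = 0, []
    for rank, doc in enumerate(ranked_docs, start=1):
        if doc in relevant:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / len(relevant) if relevant else 0.0

if __name__ == "__main__":
    # Hypothetical run and relevance judgments for one topic.
    run = ["d3", "d7", "d1", "d9", "d4", "d2"]
    judged_relevant = {"d1", "d4", "d8"}
    print(f"average precision = {average_precision(run, judged_relevant):.3f}")
```

Averaging this quantity over all topics yields the familiar mean average precision reported in much of the experimental work that reuses the TREC collections.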

Last year, the division and the panel were questioned by NIST management as to the ongoing relevance of the TREC Program, which began in the early 1990s. The panel examined the current activities in TREC, as it does each year, and in the 2001 assessment report carefully spelled out how TREC has evolved significantly over time and how it continues to provide critical databases, evaluation software, and metrics for industrial, academic, and government institutions to use in developing new information management technologies. In light of this reassurance that TREC continues to provide new and necessary tools to NIST’s customers, NIST management has ceased pressuring the division on this question. The panel commends NIST for its responsiveness and its trust in the peer assessment process.

In the Image Group, the work required by the PATRIOT Act on developing and certifying a technology standard for identity verification will certainly have an impact on governmental use of biometric technologies. The homeland security applications of this work bring opportunities and new challenges. First, they give the division a chance to affect agencies outside its traditional law-enforcement-community customers, as these technologies may be used for border security, for identification cards for airport employees, and for drivers’ licenses. However, these very applications make the profile of the work much higher than the division is accustomed to, and they bring the NIST activities under scrutiny from technical, political, and privacy groups.

While the increased work on biometrics may have security and governmental origins, it is critical that the division not lose sight of the opportunity provided to influence the commercial human identity technology arena. The biometric industry will benefit immensely from the establishment of proper testing and measurement procedures, which can drive the development of reliable products and increase the industry’s credibility with potential users.

Usability is an important and often-neglected component of software and Web site design, and the Visualization and Usability Group has been instrumental in drawing attention to this issue and in providing the tools necessary to improve the situation. The CIF and CIFter Programs are focal points that bring together creators and consumers of information technologies. The CIF Program focuses on working with industry groups to identify techniques for effectively incorporating usability into software procurement decisions. The new CIF standard10 provides a foundation for exchanging usability information and is already being used for procurement decisions by large enterprises. The CIFter and Web Metrics efforts are focused on improving the methods employed for evaluating the usability of IT products and Web sites.

The results of the Visualization and Usability Group’s work are disseminated in a variety of ways. Staff are active in efforts on usability standards at the American National Standards Institute/National Committee for Information Technology Standards and the World Wide Web Consortium (W3C) and in usability conferences, such as the annual Conference on Human Factors and the Web and the ACM Conference on Human Factors in Computer Systems. NIST also sponsors workshops, staff participate in working groups, and the division collaborates with government customers (such as the National Cancer Institute). These activities all make important contributions to improving usability methods and practices in the field.

The division’s recent completion of a room for automatic meeting data collection and its refinement of an API for pervasive computing devices will enable staff to produce important tools for the pervasive computing research community. One output is a collection of multimedia databases of meetings produced by the automatic meeting transcription project. These databases are huge (60 GB of data for each hour of meeting time) and would be difficult for any individual research group to compile on its own. NIST’s compilation and dissemination of these databases will facilitate research and improve the state of the art by enabling comparison testing of various products and techniques. The second key division product is the API itself. The API, which came out of Smart Flow data capture work, is a critical tool for efficiently setting up multimedia and pervasive computing laboratories. This API will allow researchers to connect cameras, microphones, and other equipment without building their own baseline software each time. Thus, researchers can focus directly on experiments in their laboratories, and comparative testing on a common platform will now be possible. NIST’s work in pervasive computing is appropriate, and its results are of great interest to the relevant research community. A workshop hosted by NIST in May 2001 attracted representatives of groups from around the world.
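
To give a sense of what such an API affords researchers, the sketch below models a minimal sensor data-flow setup in Python. The class and method names are entirely hypothetical illustrations of the general idea of routing capture-device output through a common software layer; they are not the interfaces of NIST's Smart Flow software.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Flow:
    """A named data stream (e.g., audio frames from one microphone)."""
    name: str
    subscribers: List[Callable[[bytes], None]] = field(default_factory=list)

    def publish(self, frame: bytes) -> None:
        for handler in self.subscribers:
            handler(frame)

@dataclass
class CaptureNode:
    """Wraps a physical device and pushes its frames onto a flow."""
    device_id: str
    flow: Flow

    def on_frame(self, frame: bytes) -> None:
        self.flow.publish(frame)

# A researcher wires a camera and a microphone to experiment code without
# writing device-specific plumbing each time.
video = Flow("camera-1/video")
audio = Flow("mic-3/audio")
video.subscribers.append(lambda f: print(f"video frame, {len(f)} bytes"))
audio.subscribers.append(lambda f: print(f"audio frame, {len(f)} bytes"))

CaptureNode("camera-1", video).on_frame(b"\x00" * 1024)
CaptureNode("mic-3", audio).on_frame(b"\x00" * 320)
```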

The Information Access Division effectively disseminates its results to a wide variety of customers. Eight workshops and conferences were organized by division staff in 2001, and these meetings were attended by representatives of many respected academic and industrial organizations. In addition, the proceedings from these conferences are often published by NIST and made publicly available on the Internet. Test sets, databases, and APIs produced by the division are frequently downloaded from the Web, and these packages, related to biometrics, speech, information retrieval, and usability, have been circulated widely. Division staff also give presentations, participate on standards committees, and publish papers in journals, conference proceedings, and other publications. Roughly one-fifth of the division’s 27 publications in 2001 appeared in archival journals; the panel encourages staff to explore whether more papers could appear in these sorts of journals, to supplement the publications in conference proceedings, which may be less widely available to potential NIST customers.

10  

This is American National Standards Institute/National Committee for Information Technology Standards (ANSI/NCITS) 354-2001.

Division Resources

Funding sources for the Information Access Division are shown in Table 8.5. As of January 2002, staffing for the division included 40 full-time permanent positions, of which 35 were for technical professionals. There were also 6 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

The level of resources has been adequate for most projects in the Information Access Division. A very high percentage of the division’s funding comes from outside sources—currently support is provided by several agencies, including DARPA, AROA, the FBI, NSA, and the National Institute of Justice. In the past, the panel has expressed concerns about the relative balance between internal and external support, but it does recognize that division management is aware of the potential risks associated with dependence on outside funding. The panel also believes that the ongoing work for other agencies is appropriate for the division and that it serves to advance the NIST mission, particularly as homeland security is one of NIST’s new Strategic Focus Areas.

The growing emphasis on activities related to homeland security and the new burdens imposed by the PATRIOT Act may strain the current resources allocated to this division. If new staff time and funding are needed to meet the goals in these areas, the panel hopes that resources can be provided from new allotments of congressional funding and from other government agencies rather than from reallocation. ITL and division management should be wary of the impact of these programs on other division activities, whose progress could be delayed or curtailed as a result of pressure from the homeland security work.

TABLE 8.5 Sources of Funding for the Information Access Division (in millions of dollars), FY 1999 to FY 2002

Source of Funding                       FY 1999     FY 2000     FY 2001     FY 2002
                                        (actual)    (actual)    (actual)    (estimated)
NIST-STRS, excluding Competence             4.5         4.6         4.8         4.6
Competence                                  0.0         0.0         0.0         0.1
ATP                                         0.2         0.1         0.0         0.0
OA/NFG/CRADA                                3.2         4.0         4.5         4.4
Total                                       7.9         8.7         9.3         9.1
Full-time permanent staff (total)a           40          39         39a          40

NOTE: Sources of funding are as described in the note accompanying Table 8.1.

aThe number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March.

The human resources available to the Information Access Division are quite good. The division gains access to talented and relatively inexpensive researchers by offering temporary positions to European students and postdoctoral research associates. Unfortunately, the division is rarely able to hire even the best of these temporary employees into permanent positions, because such slots seldom open up. However, the low turnover among the permanent staff does testify to the high morale and good working environment that clearly exist in this division.

Convergent Information Systems Division

Technical Merit

The mission of the Convergent Information Systems Division is to conduct research and development into integrated systems, architectures, applications, and infrastructure for the exchange, storage, and manifestation of digital content and to explore their scalability, feasibility, and realization for new applications.

In line with its mission statement, the Convergent Information Systems Division continues to provide industry with important standardization and testing services for the exchange, storage, and manifestation of digital content. The projects under way examine technologies and investigate the issues that arise when combining these technologies. Division staff then focus on producing tools and knowledge to facilitate pulling complete systems together.

The Convergent Information Systems Division was formed in October 2000. During 2001, its first full year of operation, the division won numerous prestigious awards, and the panel commends the division for the ongoing evolution of its projects—through shifts in focus within broader program areas, through starting up entirely new activities, and through the conclusion of completed projects. The division has two organizational units, the Distributed Systems Technologies Group and the Information Storage and Integrated Systems Group, and work during 2001 occurred in approximately seven program areas. Below the panel discusses objectives, highlights, and issues in six of these ongoing programs: Digital Cinema, Biometrics, Optical Disk Storage, Trust Management/Digital Rights Management, E-Books and the Braille Reader, and Interactive Digital Television. Work in the seventh area, Cluster Computing, is concluding this year.

In the Digital Cinema Program, the focus is currently on content manifestation and consumption. The division has set up a facility that industry groups and vendors can use to test parts of the digital cinema chain. The goal is to provide industry with test and measurement services for digital cinema acquisition devices, transmission facilities, storage devices, and display systems. One recent accomplishment is the development of a computer application that can preview how compressed MPEG video appears on displays with various color bit depths; such an application can allow a producer or editor to make decisions about the amount of compression in light of the equipment likely to be used by the target audience. Another activity is the establishment of a multimedia editing system that takes input from a variety of sources, allows audio and video editing, and can output the results in any digital or analog format. This system allows division staff to research digital content types on a low-cost, PC-based editing system and provides broad flexibility in achieving the goals of content interoperability. Using this system, the division is testing the interoperability of different digital content types, developing tools for interoperability use, and studying performance issues related to content transfer and delivery (e.g., download times for enhanced content). The system will also enable staff to examine the usage of recombinant media types, such as mixed video, text, and audio, for specific digital applications (e.g., electronic learning) or for content compatibility with writable storage media (e.g., recordable and rewritable DVDs and CDs).
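
As a rough illustration of the kind of preview such an application performs, the sketch below quantizes 8-bit color channels to a lower bit depth, one simple way to approximate how content would appear on a display with less color resolution. The code is a generic example and is not the division's application; it assumes NumPy is installed.

```python
import numpy as np

def reduce_bit_depth(frame_rgb: np.ndarray, bits: int) -> np.ndarray:
    """Quantize an 8-bit-per-channel RGB frame to `bits` bits per channel,
    then re-center each band so the result can still be shown on an 8-bit display."""
    levels = 2 ** bits
    step = 256 // levels
    quantized = (frame_rgb // step) * step   # collapse each channel to fewer levels
    return quantized + step // 2             # re-center within each quantization band

# Hypothetical 2x2 test frame previewed at 5 bits per channel.
frame = np.array([[[200, 30, 15], [64, 64, 64]],
                  [[255, 255, 0], [10, 200, 90]]], dtype=np.uint8)
print(reduce_bit_depth(frame, bits=5))
```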

The division continues its long-standing work in biometrics. This past year, the division facilitated the creation of two critical standards for biometric technologies: the Biometric Application Programming Interface (BioAPI, version 1.0) and the Common Biometric Exchange File Format (CBEFF). BioAPI provides a common program interface for biometric devices and was approved as an ANSI standard in February 2002. This interface enables programmers to write code that can be used with multiple capture devices or with back-end minutia extraction routines. The BioAPI specification technology is critical in allowing biometric devices to have a common look and feel. The CBEFF, which was published in January 2001,11 provides a standard biometric data format that facilitates data interchange across different applications and devices. Currently, the International Biometric Industry Association is managing the registration of CBEFF format owner and format type values. The division’s work has been critical to the success and growth of the biometrics industry. The panel notes that there are several ways in which NIST could continue to assist the biometrics research and development community in improving its products. Particularly helpful would be standardized mechanisms for determining false acceptance and false rejection rates, for evaluating and classifying various recognition algorithms and software, and for defining standard threat and usage models to enable effective definition and evaluation of implementation and deployment requirements.
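
To illustrate the sort of standardized error-rate computation the panel has in mind, the sketch below derives false acceptance and false rejection rates at a chosen decision threshold from genuine and impostor match scores. The scores and threshold values are hypothetical; an actual test protocol would specify how such scores are collected and reported.

```python
def error_rates(genuine_scores, impostor_scores, threshold):
    """False rejection rate: fraction of genuine comparisons rejected (score below threshold).
    False acceptance rate: fraction of impostor comparisons accepted (score at or above threshold)."""
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return far, frr

# Hypothetical match scores from a fingerprint matcher (higher = better match).
genuine = [0.91, 0.84, 0.77, 0.95, 0.58]
impostor = [0.12, 0.40, 0.33, 0.62, 0.08]
for t in (0.5, 0.6, 0.7):
    far, frr = error_rates(genuine, impostor, t)
    print(f"threshold {t}: FAR={far:.2f}, FRR={frr:.2f}")
```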

Another long-standing effort is the work in the Optical Disk Storage Program, which includes a number of individual projects. A facility has been built in which the lifetime of data stored on digital versatile disks (DVDs) can be tested. Also, staff have developed an application software package to test the compliance of DVDs and compact disks (CDs) with the CD-Multi standard. This package is the only software available to companies and consumers for checking whether their drives are compatible with the disks on the market. Future activities in the Optical Disk Storage Program include the construction of a laboratory to test interoperability for optical jukebox storage units. No interoperability among the various storage systems currently exists, and the division hopes to use this new facility to help the industry develop interoperability tools. The laboratory will be a joint project with the High Density Storage Association (HDSA), and a CRADA has been put in place between the division and HDSA. All of these projects contribute to the division’s ability to take an active role in the DVD community. With the DVD Association, the division is cosponsoring a June 2002 conference on DVD standards, technologies, applications, and use for homeland defense.

The Trust Management Program is continuing its evolution toward a focus on digital rights management (DRM) and content encapsulation. Work continues on financial agent secure transactions (FAST); industry recently endorsed the division’s efforts to examine the relevance of this trust model to e-commerce exchange by small electronic manufacturers. In DRM, the division has hosted two workshops to bring together the providers of content (such as movies, music, and books) that requires digital content protection with the makers and proponents of various technologies whose purpose is to supply this protection. The panel believes that these workshops should be an opportunity for NIST to obtain commitment from various industry content providers and technology providers to cooperate in the creation and use of a truly interchangeable format. If these stakeholders agree to this goal and if they require NIST’s assistance in the standardization of rights management for digital content, then the Convergent Information Systems Division should build a more substantial program in this area. Since the division does not currently participate in DRM activities, it may be particularly well suited to broker consensus among the participants in this contentious area.

11  

Fernando L. Podio et al., U.S. Department of Commerce, Technology Administration, National Institute of Standards and Technology, Common Biometric Exchange File Format, NISTIR 6529, National Institute of Standards and Technology, Gaithersburg, Md., January 2001. Available online at <http://www.itl.nist.gov/div895/isis/cbeff/CBEFF010301web.PDF>.

The division’s work on electronic books (e-books) brought together diverse groups from the industries interested in electronic publishing. However, in the end these stakeholders did not agree to develop and use truly interoperable hardware formats that would allow books in various content formats to be read using software from different vendors. The division has managed to create a common format within which various proprietary book formats may be placed for easier exchange across platforms. Given the industry’s unwillingness to truly embrace interoperability, the panel believes that this program either should be concluded or should be redirected toward work in the broader area of electronic publishing, encompassing issues related to hardware interoperability, assistive technologies, and sources of content capture (e.g., digital cameras). Such activities would complement the division’s new focus on content packaging and formatting and content manifestation and consumption (the DRM work). The goal would still be achieving true interoperability between book formats and reader software in the market. However, continuation and redirection of this program should only occur if the market for electronic books appears to pick up, with the development of new hardware and the making of new commitments by publishers.

The division’s work on digital television (DTV) is winding down. NIST has provided a significant service to industry by releasing the first public and free implementation of the Java APIs, which are used as the Advanced Television Systems Committee’s (ATSC’s) interactive DTV middleware standard. However, the broadcast market for interactive DTV is not taking off as many had expected. The panel therefore supports the division’s decision to redirect resources in new directions, such as middleware for biometric systems. The expertise gained in middleware development from the DTV application software environment (DASE) project will be crucial for examining the issues related to the integration of biometric point solutions into robust system architectures. For example, the deployment of biometric systems for homeland security will require that the middleware environment be tested for performance and that standard system evaluation methods be developed.

Overall, the panel commends the willingness of the Convergent Information Systems Division to refocus and to conclude programs. The division has been responsive to recommendations made in last year’s assessment, as can be seen in the conclusion of some projects (such as time synchronization, DASE, and the Braille Reader) and the increased emphasis on others (such as biometrics and data preservation). New activities, such as the work on rights management for digital content, have been built on existing programs, and entirely new projects have been started, such as the quantum information testbed. In this last activity, the Convergent Information Systems Division has taken over supervision of the facility, which is a joint project between ITL and the Physics Laboratory. The facility is supported by DARPA, whose representatives toured it in February 2002 and were pleased with the progress being made on this work.

Program Relevance and Effectiveness

The Convergent Information Systems Division reaches out to a diverse group of customers through various mechanisms. The staff make information about the division’s results, products, and activities available through publications and through the recently overhauled divisional Web site. Presentations are given to fellow researchers at conferences, to industry representatives at workshops, and to the public at schools and science fairs. The staff collaborate with researchers in other units and at other institutions; there are eight ongoing collaborations with personnel in other ITL divisions or other NIST laboratories and four collaborations with university researchers. The division also regularly gives tours of its facilities and has filed for patents (one in 2001) when necessary. The division gathers a variety of data on all of these dissemination efforts, and the panel suggests that the metrics used could be further refined. For example, in the area of Web hits, appropriate additional questions might include who downloaded what, how and why they used it, and what benefits were realized.

The diverse array of outreach activities enables the division to connect with customers in its three main audiences: the government, industry, and the public. Although the division’s primary focus is on industry, several of its projects are particularly relevant to other government agencies in light of the events of September 11. NIST’s work on defining biometric standards will be heavily leveraged as the country looks at deploying mechanisms to identify terrorists and track terrorist activity. The division brings particular expertise with respect to the systems-level issues for using biometrics, and it has two relevant and newly funded projects—Biometrics Systems Integration and Efficient Infrastructures for Biometrics. The Convergent Information Systems Division’s efforts in this area will complement the work under way in the Information Access Division, which has been tasked under the recent PATRIOT Act with evaluating biometric technology for border and visa control. Even prior to the emphasis on homeland security, the Convergent Information Systems Division’s biometrics programs were relevant to various government customers, and division results are being utilized already in the Department of Defense’s Biometric Management Office and the General Services Administration’s Common ID card project.

Another division effort with potential homeland security applications is the work on data preservation in the optical disk storage area. The division’s standards and tests for determining the lifetime of data stored on various disks and for quantifying the influence of environmental factors on disk performance will provide crucial information for determining strategies and systems for disaster mitigation and recovery efforts. Relevant current projects include the development of codes for data recovery from optical disks and the construction of the NIST-HDSA interoperability test facility. As with the biometrics projects, the Optical Disk Storage Program also has governmental customers outside homeland security applications. Division staff currently serve on the Advisory Board for the Library of Congress. This board has been tasked by Congress to develop a national strategy for data preservation of digital content, which is a $175 million effort. As a result of inputs by NIST, the National Science Foundation and the Library of Congress sponsored a workshop in April 2002 on developing a national agenda for data preservation. The workshop assembled experts from around the country, from academia, government, and industry. The division headed the session on data preservation tools and technology.

The Convergent Information Systems Division has a close and productive relationship with companies in a variety of industries. The main form of interaction is through industry and professional associations. For example, NIST leads the Biometric Interoperability, Performance, and Assurance Working Group of the Biometric Consortium. This group includes 85 organizations and has three key projects: performance test procedures, CBEFF format for smart cards, and biometric assurance requirements. The value of the biometrics work to industry was roughly quantified in a recent economic impact study by Business Process Research Associates, which cites the division’s efforts in biometrics as having delivered up to $136 million in benefits through its work to consolidate two competing standards organizations and its facilitation of the adoption of both BioAPI and CBEFF. The value of the division’s work to industry is also made clear by testimony from various commercial groups. In the DASE project, the appreciation of the industry was expressed in a letter from the executive director of the Advanced Television Systems Committee. In the Digital Cinema Program, the work on color characterization for display technologies received a warm testimonial from the Society of Motion Picture and Television Engineers. In the trust management area, the Financial Services Technology Consortium (a consortium of banks and lending institutions) is endorsing and using NIST’s efforts on FAST.

In all of the division’s programs, the ultimate beneficiaries are the users of the various systems made by the division’s industrial partners and customers. NIST’s standards, tests, and interoperability tools all help industry improve its products. In some projects, however, the primary beneficiary of the division’s work is the public. The Braille Reader developed by division staff has won two prestigious innovation awards: the R&D 100 Award given by R&D Magazine and the Top40 Socially Conscious Innovations Award given by ID Magazine. However, industry has not adopted this technology, and no company seems to be looking to deploy or produce a Braille Reader at this time. Therefore, the division is working with the National Federation of the Blind on ways in which the federation might take this assistive technology forward into mainstream production.

Despite the effectiveness of division interactions with governmental and business customers and the benefits that consumers experience from the work done at NIST, an issue still exists with respect to communication to the world at large. For example, the division’s work on disk lifetime characterization and on CD and DVD compliance testing needs more exposure and publicity if consumers (individuals and companies) are going to understand the benefits and make use of the information when making purchasing decisions. The multiread standard is used by manufacturers of disks and equipment in the CD and DVD industry, but consumers, if they were aware of the implications of the CD-Multi specification and logo, could use the information to judge compliance of drives and disks. This information is especially useful in light of the upcoming fragmentation of the market in DVD-recordable drives and disks.

Division Resources

Funding sources for the Convergent Information Systems Division are shown in Table 8.6. As of January 2002, staffing for the division included 14 full-time permanent positions, of which 12 were for technical professionals. There were also 18 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

The small number of permanent staff (14) in the division is supplemented by a large cadre of guest researchers (6) and students (12). While this approach has some risks, particularly with regard to the continuity of programs, these temporary employees greatly expand the capabilities available to the division and provide flexibility. When the division had to replace a permanent staff member in 2001, the recruitment and hiring process for this type of position took an inordinate amount of time.

The Convergent Information Systems Division is unusual in ITL in that laboratory experiments are carried out by division staff. Thus, making sure that staff have access to up-to-date equipment is vital to maintaining their credibility as systems engineering experts tackling problems related to current industry needs. For example, the work on biometrics and related access authentication usage and models requires samples of the relevant technology and equipment. The same need exists in the Digital Cinema laboratories. Testbeds for industry products or software must contain the products and software being used in industry.

The division is tightly packed into the space currently allocated for its work. However, the division conducts more than 30 tours a year of its laboratory facilities, and, through a close relationship with the NIST Office of Public and Business Affairs, these laboratories provide ample public relations value to NIST. Thus, any contraction or relocation of divisional facilities would negatively impact staff productivity, their ability to demonstrate cutting-edge technologies to industry and other government agencies, and NIST’s promotional efforts.

TABLE 8.6 Sources of Funding for the Convergent Information Systems Division (in millions of dollars), FY 1999 to FY 2002

Source of Funding                       FY 1999     FY 2000     FY 2001     FY 2002
                                        (actual)    (actual)    (actual)a   (estimated)
NIST-STRS, excluding Competence             2.6         2.2         2.3         2.3
STRS—supercomputing                         9.9        10.0         0.5         0.0
ATP                                         0.6         0.8         0.6         0.8
OA/NFG/CRADA                                0.4         0.8         0.3         1.0
Other Reimbursable                          0.0         0.8         0.0         0.0
Agency Overhead                             6.7         7.4         0.0         0.0
Total                                      20.2        22.0         3.7         4.1
Full-time permanent staff (total)b           75          81         13b          14

NOTE: Sources of funding are as described in the note accompanying Table 8.1.

aThe significant difference between the FY 2000 and FY 2001 funding and staff levels reflects the reorganization of the Information Technology Laboratory, in which the information technology service groups and the Scientific Applications and Visualization Group were moved out of this division to the Information Services and Computing Division and the Mathematical and Computational Sciences Division, respectively.

bThe number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March.

Information Services and Computing Division

Technical Merit

The mission of the Information Services and Computing Division is to provide an efficient, effective, and secure NIST IT environment and advance the utilization of IT to empower NIST and its stakeholders to accomplish their mission with maximum impact. The vision of the division is to be a vital partner in the accomplishment of NIST’s mission through the provision of premier IT services.

The division’s current responsibilities include supporting administrative and scientific computer systems, deploying and maintaining telephone and data networks, and providing high-performance computing capabilities and data storage. These IT service activities were consolidated into one division in October 2000. Since that time, division staff have worked hard to define the role of an IT support unit—what its responsibilities are and how best it can provide those services to the rest of NIST. Using a consultant, the division has developed a blueprint describing an architecture for IT at NIST; this plan should serve NIST well in the future.

In February 2002, the newly appointed NIST director announced that the Information Services and Computing Division would soon be subsumed into a new, institute-wide IT services organization. This organization will be run by a chief information officer (CIO), who will report directly to the director of NIST. The review panel applauds this decision, as the shift is appropriate in light of the increasing strategic importance of computing and communications in the research and administrative functions of NIST.

Despite the fact that the Information Services and Computing Division was created less than 2 years ago and despite the new uncertainties raised by the recently announced plans for another major organizational shift, the division has made significant progress since the previous assessment. The panel is impressed by the energy and seriousness with which the division responded to recommendations in the 2001 assessment report and by the improvements made in a number of areas.

A prime example of the division’s responsiveness to the panel’s concerns is the work done on getting NIST connected to Internet 2. Last year, the panel had been troubled by the poor connectivity available to NIST researchers. This year, funding has been secured to support NIST’s joining an Internet 2 consortium, and a plan is set to get the connection in place by October 2002. The commodity Internet connection will remain in service, as this link will still be required to exchange data with entities not served by Internet 2. The panel applauds this effort, as access to Internet 2 is expected to enhance NIST researchers’ ability to collaborate with leading universities. As the division moves forward on this task, a key challenge will be promoting and supporting Internet 2 within NIST. The division should not assume that this higher-performance, wider-area networking capability will be instantly embraced or effectively utilized by research staff. Experiences at universities already connected to Internet 2 suggest that many researchers will not understand either the short-run or longer-run implications of Internet 2, and education and training will be necessary to convince them of the benefits and eventually to produce the expected increases in research productivity.

While rolling out support for Internet 2, the division might also want to explore whether NIST could be connected to the national high-performance grid. In grid computing, high-performance networks, sophisticated management software, and other middleware enable sharing of remote high-performance computing systems, instrumentation, and data collections. This approach could be a useful way to provide better, more efficient high-performance computing to NIST researchers in the future. At the present time, however, the panel is pleased to note that the division has upgraded the existing high-performance computing platforms.

A variety of important advances occurred in the past year in the array of services that the Information Services and Computing Division provides NIST. One of the most noticeable and exciting changes was the creation of a central IT help desk. Facilities were constructed in Gaithersburg and Boulder, and the service was officially launched in February 2002. The scope of the help to be offered is broad; people’s questions on PC support, Web services, scientific computing, computer security, networking, telecommunications, administrative applications, and Unix and NT servers will all be directed to the help desk staff. The comprehensive nature of this centralized service is important, as the goals of creating it include reducing customer confusion about what services the division provides and about who can help them. Another potential benefit will be a uniform system for tracking service requests and generating service performance metrics.

An important responsibility of the Information Services and Computing Division is related to the security of NIST’s computer systems. Progress has occurred in this area in the past year, but issues still remain. Steps forward include the deployment of an E-Approval system, which prepares NIST to comply with the Government Paperwork Elimination Act that goes into effect in 2003. The system is based on a NIST-wide public-key infrastructure, which also supports digital signatures and e-mail encryption.
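
For readers unfamiliar with what a public-key infrastructure enables at the application level, the sketch below creates and verifies a digital signature with the widely used third-party Python cryptography package; in a deployed PKI the public key would be distributed in a certificate issued by the organization's certificate authority. The message content is hypothetical.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Key generation; in a real PKI the key pair is bound to a user via a certificate.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"Hypothetical e-approval: purchase request 42, approved."
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

# The signer uses the private key; anyone holding the public key can verify.
signature = private_key.sign(message, pss, hashes.SHA256())

try:
    public_key.verify(signature, message, pss, hashes.SHA256())
    print("signature valid")
except InvalidSignature:
    print("signature invalid or message altered")
```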

The panel believes that it is important for the division to push forward with its plans to upgrade network and systems security at NIST. A major current concern is systems administration, which is distributed to people throughout NIST and is acknowledged by staff to be out of control. For example, no way currently exists to comply with the Department of Commerce directive to terminate staff accounts within 24 hours of separation. Larger audit requirements are also not being met.

Another area of potential improvement is that of business continuity planning. While a disaster recovery approach is in place with a contractor (Sungard), the panel believes that the plan could be updated. The current plan does not appear to reflect or accommodate the increasingly computer- and data-intensive research and administrative work performed at NIST. Also, the plan should take full advantage of the fact that NIST has two physical campuses (Gaithersburg and Boulder), and the option of using distributed data centers at these sites should be considered.

Program Relevance and Effectiveness

The Information Services and Computing Division serves customers throughout NIST, and NIST could not function efficiently or turn out the high-quality research discussed throughout this report without the support of this unit. Not only do the division staff maintain the telephone and networking infrastructure and support the 9,100 desktop computers at NIST,12 but they also provide needed capabilities to individual units, such as administrative applications (e.g., for accounting) and high-performance computing capabilities (e.g., for scientific modeling programs).

The division has made a number of efforts this year aimed at saving time and money for NIST staff in their IT activities. One example is the launching of a NIST-wide PC-buying service, which will be extended to Unix workstations later this year. Centralized purchasing of PCs (and workstations) will produce several benefits both for the division and for NIST staff as a whole. Some of the pluses are reduced administrative and procurement costs, uniform security setups on all new PCs, and easier installation of new technologies and software. For example, by having IT services staff determine standard configurations of PCs and workstations that are compatible with NIST’s security and administrative requirements and having these staff manage the migration and installation of the new systems, NIST researchers will save time previously devoted to researching systems and transferring their work.

The division is focused completely on serving the needs of its customers—NIST personnel—which is appropriate. However, the panel believes that the division might benefit from reaching out to other organizations that support active IT-intensive research. Many research universities, other national laboratories, and corporate research and development entities face challenges similar to those at NIST: diverse platforms and widely varying technical expertise in their user communities. Relationships with these organizations might provide information about best practices, alternative solutions, and lessons learned that the division could productively apply to IT services at NIST. One possible way to engage with these institutions would be through memberships in relevant associations, such as Educause, the premier IT management association serving higher education and related entities, and the Global Grid Forum, a large collection of individual researchers and practitioners working on distributed computing, or grid technologies.

Division Resources

The panel believes that for the Information Services and Computing Division to move forward with many of its plans for improving the IT environment at NIST, it must proactively engage the NIST research community. While the division is the primary source of IT expertise and support, some of the responsibilities for IT services are dispersed throughout NIST, as is the case in many research universities. This situation has positive and negative aspects. The negative aspects include the costs, in both money and efficiency, of inconsistent or duplicative efforts—for example, the distributed systems administration, which is weakening computer security at NIST. Another example is contracts with external providers of services or software, where each individual unit negotiates its own rates and terms. The panel is pleased to see that the division is working very hard to fix both of these situations. The positive aspect of IT services being provided in the units is the presence of support staff who work closely with a specific set of researchers and have a deep understanding of their needs and constraints.

12  

The total number of desktop computers breaks down approximately as follows: PCs, 7,500; Suns, 600; SGIs, 200; and Macs, 800.

To take full advantage of the positive aspects of dispersed IT support personnel, the panel suggests that the division consider creating numerous separate, and perhaps unique, service-level agreements with the laboratories and research groups rather than a uniform approach to diverse NIST-wide needs. A range of agreements might reassure the laboratory staff that the division recognizes that each group has its own special needs, and then it might make the researchers more understanding of the instances in which some degree of uniformity is necessary (e.g., in security). Another path to encouraging conformance to standards and use of centralized services is that of making the centralized services more attractive than the alternatives. When the central mail services were improved recently, more people began using the main mail servers, and many of the small independent, unsupervised mail servers in the individual laboratories were able to be shut down. Security improved, and the amount of time spent NIST-wide on supporting e-mail services decreased. The recent launch of a centralized PC-buying service is another example of providing incentives for researchers to embrace a more efficient and uniform system. In the future, the division might consider consolidating server and storage functions so as to both improve responsiveness to the laboratories’ research agendas and reduce NIST’s overall costs and risks of lost data.

The panel is pleased to learn that the new IT services unit will include a group focusing particularly on providing solutions to assist researchers in tackling unique scientific problems. The goal would be to help NIST staff in the laboratories utilize commercial off-the-shelf products relevant to their experiments and perhaps to create an “explorers group” that would investigate new software and hardware that might be applicable to NIST research. The panel supports this approach and hopes that it will help the IT support group engage the NIST research community in creative thinking about IT solutions and allow the scientists to realize the value that IT can add to their experimental work. It was not clear to the panel if the current structure through which the division receives advisory input from the laboratories is actually providing (or being perceived as providing) optimal opportunities for two-way communication. It is also possible that communications and relations between the IT services unit and its customers might improve with the increased stature of IT services that may result from its organizational shift from laboratory division to individual unit whose head reports directly to the NIST director.

The panel notes that one consequence of the organizational change will be that the IT services are no longer reviewed by the National Research Council assessment panels, which review the programs under way in the NIST Measurement and Standards Laboratories, such as ITL. External assessments of programs drive self-evaluation as well as providing unbiased advice from different perspectives, and the panel recommends that NIST management explore ways for the new unit to receive this sort of input from outside the institution.

Funding sources for the Information Services and Computing Division are shown in Table 8.7. As of January 2002, staffing for the division included 136 full-time permanent positions, of which 109 were for technical professionals. There were also 11 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

In last year’s report, the panel encouraged division and laboratory management to look for ways to increase diversity at the management level. This year, as part of a reorganization of the division’s groups, competitions for several top-level positions were reopened in search of candidates, to bring new people into the management ranks. Although few women and minority management candidates were ultimately identified, the initiative set an important precedent, and the panel supports further efforts in this area.

TABLE 8.7 Sources of Funding for the Information Services and Computing Division (in millions of dollars), FY 1999 to FY 2002

Source of Funding                       FY 1999     FY 2000     FY 2001     FY 2002
                                        (actual)    (actual)    (actual)a   (estimated)
NIST-STRS, excluding Competence             0.6         0.9         0.3         0.2
STRS—supercomputing                         0.9         0.9         7.4         9.1
ATP                                         0.0         0.0         0.1         0.2
OA/NFG/CRADA                                0.0         0.0         0.6         0.7
Other Reimbursable                          0.4         0.6         1.0         0.3
Agency Overhead                             7.1         8.2        18.2        25.8
Total                                       9.0        10.6        27.6        36.3
Full-time permanent staff (total)b           72          77        131b         136

NOTE: Sources of funding are as described in the note accompanying Table 8.1.

aThe significant difference between the FY 2000 and FY 2001 funding and staff levels reflects the reorganization of ITL, in which information technology service groups were moved out of the Convergent Information Systems Division and into this division.

bThe number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March.

Software Diagnostics and Conformance Testing Division

Technical Merit

The mission of the Software Diagnostics and Conformance Testing Division is to develop software testing tools and methods that improve quality, conformance to standards, and correctness; to participate with industry in the development of forward-looking standards; and to lead efforts for conformance testing, even at the early development stage of standards.

The division’s work designing conformance and diagnostic tests and developing reference implementations for standards bodies clearly fulfills its mission and is consistent with the goals expressed in both the division and NIST missions. The division is organized in three groups: Software Quality, Interoperability, and Standards and Conformance Testing. The technical merit of the work of all three groups is quite high.

The Software Quality Group develops methods to automate software testing, develops software diagnostic tools, and performs research in formal methods. Projects under way include work on automatic test generation, enterprise single sign-on, quantum information, interactive television, and health care information systems. In the automated test generation project, staff have drawn upon prior work in test harnesses, mutation testing, and specification-based testing. This project is relatively mature, and the panel expects that the current toolset will be transitioned to industry quite soon. The staff have maintained an excellent relationship with Ford Motor Company over the course of this project, and the division should consider Ford and other companies as possible targets for this technology transfer.

If the automated test generation project does continue at NIST, the division might consider adding formal test oracles to the toolset; these could be used for checking test results against specifications of expected or desired functionality or other quality properties. In the current toolset, the test data are generated on the basis of specification mutations that are used to provide an adequate range of tests for the software. In the mutation testing paradigm, a mutant is killed when a test produces results on it that differ from those produced on the original; however, the correctness of the original’s results must still be established by some other means. Determining correctness this way is extremely costly and, if not automated, potentially error-prone. Using the formal specifications to derive automated test oracles would be an effective alternative. Since formal specifications are available in the domains of exploration (consortia and standards bodies), the use of formal specification-based test oracles would improve the quality of the toolset at a limited additional cost.
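
The following sketch illustrates the distinction in miniature: a postcondition taken from a formal specification acts as an automated oracle, so generated test inputs can be judged without a human establishing the expected output for each case. The specification, the function under test, and the inputs are hypothetical.

```python
import random

def sort_under_test(xs):
    """Implementation being tested (imagine it was generated or hand-written)."""
    return sorted(xs)

def oracle_sorted(inputs, outputs):
    """Postcondition drawn from a formal specification of sorting:
    the output is ordered and is a permutation of the input."""
    ordered = all(a <= b for a, b in zip(outputs, outputs[1:]))
    permutation = sorted(inputs) == sorted(outputs)
    return ordered and permutation

# Automatically generated test data can now be checked without a human
# deciding what the "correct" answer is for each case.
random.seed(0)
for _ in range(100):
    data = [random.randint(-50, 50) for _ in range(random.randint(0, 10))]
    assert oracle_sorted(data, sort_under_test(data))
print("all generated tests satisfied the specification-based oracle")
```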

The Interoperability Group works with other federal government agencies, with the voluntary standards community, and with industry to increase the use of publicly available standards in order to achieve and enhance interoperability. A primary role of this group is that of working with government groups, including the Federal CIO Council, in the application of standards and the development of interoperability tests for IT systems and products that cross several agencies. Current projects include the National Software Reference Library, computer forensics tool testing, and work on metadata standards. Group staff also serve as ITL representatives on a variety of standards committees.

The Standards and Conformance Testing Group develops conformance tests and reference implementations, performs research into better ways to do conformance testing, and, working with industry, develops standards for emerging technologies. Currently, the primary focus areas of this group are XML and pervasive computing. In the XML area, the panel continues to be impressed with how well the division works with industry groups to establish means by which software and systems can interoperate over the Internet. In pervasive computing, a key component of the work is related to architectural description languages (ADLs), which can improve technical specifications of system architectures, especially for those systems in which dynamic adaptation and dependability are required. The panel commends the division’s decision to focus pervasive computing efforts on ADLs and simulation; elements of this project will be relevant to applications well beyond the context of pervasive computing.

The panel has two suggestions about the ADL efforts. One is to consider whether xADL might be relevant to the project. Unlike traditional ADLs, xADL has an emphasis on dynamically reconfigurable architectures and is defined as a set of XML schemas. This approach gives xADL extensibility and flexibility, as well as allowing basic support by many commercial XML tools. The panel’s second suggestion is to consider expanding the intent of the ADL effort beyond improving and extending specifications and to include work on specification-based testing activities. A substantial amount of current work exists in the area of architecture-based testing, where the ideas behind specification-based testing are applied on the basis of formal architecture descriptions. Architecture-based testing is particularly useful in integration, conformance, and interoperability testing, because it is tied to the architectural design level. It is applicable in analysis, test planning, and test generation at the stage of specifying the architectural configuration and then equally applicable in actually testing the software during integration. This approach would complement and support several other division projects and hence seems worth exploring.
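
As a small illustration of how an architectural description can drive integration and interoperability testing, the sketch below reads a toy component-and-connector description and enumerates the component pairs that require interoperability tests. The XML vocabulary is invented for illustration and is not xADL or any standard ADL.

```python
import xml.etree.ElementTree as ET

# Invented, minimal component-and-connector description (not a real ADL schema).
DESCRIPTION = """
<architecture>
  <component id="sensor"/>
  <component id="aggregator"/>
  <component id="display"/>
  <connector source="sensor" target="aggregator" protocol="udp"/>
  <connector source="aggregator" target="display" protocol="http"/>
</architecture>
"""

def integration_test_obligations(xml_text):
    """Each connector implies an interoperability test between its endpoints."""
    root = ET.fromstring(xml_text)
    return [(c.get("source"), c.get("target"), c.get("protocol"))
            for c in root.findall("connector")]

for source, target, protocol in integration_test_obligations(DESCRIPTION):
    print(f"test: {source} -> {target} over {protocol}")
```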

Program Relevance and Effectiveness

The Software Diagnostics and Conformance Testing Division delivers value to users and providers of software through its facilitation of improvements in software quality and interoperability. The division develops products such as reference implementations and conformance test suites, provides technical leadership by chairing standards committees and participating in consortia, and lays the groundwork for overall advancements in this field by researching improved methods of conformance testing. NIST’s role as an active but neutral third party in standards processes, coupled with the outstanding quality of the conformance tests developed by this division, provides government and industry with a service that is both necessary and unique.

The panel continues to be impressed by the division’s focus on emerging technologies and the effectiveness with which it partners closely with industry. Staff work well with a wide variety of organizations (e.g., the Organization for the Advancement of Structured Information Standards [OASIS], the World Wide Web Consortium [W3C], the Air Transport Association [ATA]), and the division also works directly with individual companies, such as Ford Motor Company, Sun Microsystems, IBM, and Microsoft, on products and applications to improve the interoperability available to users. Overall, the division’s relationships with industry and industry groups are outstanding. The panel does note that the division’s focus on these important activities has limited the time and effort available for publications and presentations in what academics would consider the top-tier journals (e.g., IEEE or ACM transactions) and conferences (e.g., the International Conference on Software Engineering, the International Symposium on the Foundations of Software Engineering, and the International Symposium on Software Testing and Analysis). However, through its more general projects, the division does support various research communities. For example, the ADL work provides a common set of measurements to enable comparison and analysis across systems, and this clearly fulfills an important need of the ADL community.

The federal government clearly benefits, as do all users, from the division’s work to improve the interoperability and performance of commercial software systems. However, the division also has a range of activities targeted directly at assisting a variety of agencies. The highest-profile projects are the National Software Reference Library and the computer forensics tool testing, which serve the law enforcement community at many levels. The work with the Federal CIO Council also continues to be important across government, and various projects are supporting individual agencies. For example, the work on health care information systems is being done in conjunction with the Department of Veterans Affairs.

The division’s effectiveness is exemplified by the XML conformance project. In this effort, the division’s significant contributions to the standards process were critical to the success of XML as a truly “open” standard. While industry itself recognized the value of conformance tests, it was unwilling or unable to commit the resources needed to organize the development of a substantial set of tests for XML. An industry consortium launched a fledgling effort to do so, but it failed to generate sufficient support. The division stepped into the partial vacuum, led a revitalized effort, organized industry support, and collected tests from a variety of sources. These actions enabled open discussion of conformance to the standard by major (and minor) suppliers of XML technology, and the division is primarily responsible for the overall success of the effort and for the existence of the standard and the conformance tests needed for the use of XML to flourish. The W3C has now initiated a quality-assurance activity in this area, and the panel hopes that the division’s experience and expertise will be used effectively as industry moves forward on defining XML standards, testing, and usage.
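
The mechanics of such a conformance effort can be made concrete with a small harness. The sketch below assumes a hypothetical catalog format (one TEST element per case, with a TYPE attribute indicating whether the document should be accepted and a URI attribute locating it); the actual test-suite catalog differs in its details. The harness simply records whether the parser under test accepts or rejects each case as expected.

# Minimal conformance-harness sketch: run each catalogued test case
# through a parser and compare the outcome with the expected result.
# The catalog format assumed here is hypothetical.
import xml.etree.ElementTree as ET

def parser_accepts(path):
    """Return True if the parser under test accepts the document."""
    try:
        ET.parse(path)
        return True
    except ET.ParseError:
        return False

def run_catalog(catalog_path):
    catalog = ET.parse(catalog_path).getroot()
    passed = failed = 0
    for test in catalog.iter("TEST"):
        expected_ok = test.get("TYPE") == "valid"      # assumed convention
        actual_ok = parser_accepts(test.get("URI"))
        if actual_ok == expected_ok:
            passed += 1
        else:
            failed += 1
            print("FAIL:", test.get("ID"))
    print(passed, "passed,", failed, "failed")

# Example use: run_catalog("xmlconf.xml")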

As can be seen in the example above, a key element of NIST’s effectiveness is the division’s good relationships with industry and its ability to work with industry groups such as consortia. Timing is a
critical factor, too, as the division’s impact on standards and software depends on getting involved early in the standards development process. However, effective participation in consortia often requires legal paperwork and agreements, and the division continues to be hindered by the poor responsiveness of the Department of Commerce legal department.

The panel is particularly impressed with the division’s record in both beginning and concluding projects. Division staff have shown good judgment in selecting new areas in which to work, which is particularly impressive given the wide range of standards activities on which this division could potentially have an impact. The division’s philosophy of getting involved early in the standards process (focusing on emerging technologies) and partnering with industry maximizes the value of NIST’s work. In addition, the division has shown a willingness to discontinue work in an area if NIST’s contributions do not appear to be needed or if the technology is not being embraced as anticipated. Finally, the division is good at setting metrics and goals at the beginning of each project so that it is clear when the objectives have been accomplished and it is time to conclude the project. This year, the Role-Based Access Control project and the Computer Graphics Metafile (CGM) project were concluded, and their resources were largely reallocated to other projects. However, NIST staff will continue to support work in these areas when there is a specific industry request for their assistance. This willingness to provide ongoing support when necessary requires very little actual staff time but is important for ensuring successful technology transfer.

Division Resources

Funding sources for the Software Diagnostics and Conformance Testing Division are shown in Table 8.8. As of January 2002, staffing for the division included 37 full-time permanent positions, of which 33 were for technical professionals. There were also 15 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

TABLE 8.8 Sources of Funding for the Software Diagnostics and Conformance Testing Division (in millions of dollars), FY 1999 to FY 2002

Source of Funding                        FY 1999    FY 2000    FY 2001    FY 2002
                                         (actual)   (actual)   (actual)   (estimated)
NIST-STRS, excluding Competence             4.8        4.8        4.9        5.3
Competence                                  0.6        0.5        0.5        0.4
ATP                                         0.4        0.6        0.2        0.0
OA/NFG/CRADA                                0.6        1.0        1.9        2.8
Total                                       6.4        6.9        7.5        8.5
Full-time permanent staff (total)a           39         37        35a         37

NOTE: Sources of funding are as described in the note accompanying Table 8.1.

aThe number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March.

The panel continues to be impressed by the high degree of camaraderie among managers and staff in this division. Various factors appear to contribute to the good morale observed. One is a strong sense of teamwork: communication within the division is very good, and since there are no one-person projects, staff do not feel isolated. In addition, a very high percentage of the division’s activities involve collaborations with other ITL divisions, and these relationships expand staff members’ knowledge of various technical fields and are good for morale. The division has successfully recruited several new staff members in the past year, and the panel considers the division to be quite healthy. Only two concerns were expressed by staff in informal interactions with the panel. One is the difficulty of dealing with the NIST procurement system; delays in purchasing equipment are particularly frustrating in the IT arena, where the available technology changes rapidly and the division needs access to current hardware and software to have an impact on industry. The second is the division’s budget deficit as of February 2002; it was not clear where the funds needed to close this gap would come from.

Statistical Engineering Division

Technical Merit

The mission of the Statistical Engineering Division is to advance measurement science and technology by collaborating on NIST multidisciplinary research, by formulating and developing statistical methodology for metrology, and by applying statistical principles and methodology to the collection and analysis of data critical to NIST scientists and engineers.

The division is involved in a broad range of activities, including the provision of support to NIST scientific research, collaborative multidisciplinary research with NIST scientists, development of new statistical methodology with a special focus on metrology, and the transfer of statistical methodology to NIST scientists and the broader scientific community. The demand for collaborative interactions with division staff continues to be very high. Less than 2 years ago, a new division chief was hired; her primary task has been to rebuild the Statistical Engineering Division into a premier national resource for statistical sciences. Great progress has been made on this task, and continuing efforts are essential to allow the division to keep pace with the statistical demands arising from new technologies that are being applied to measurement systems and metrology. The health and activities of this division are crucial elements of the success of future NIST research.

The Statistical Engineering Division is located primarily at NIST Gaithersburg, where the staff are split into two groups: the Measurement Process Evaluation Group and the Statistical Modeling and Analysis Group. In addition, a group of staff from this division work at NIST Boulder, where they are close to collaborators from EEEL, CSTL, PL, and MSEL on that campus. Projects in a wide variety of fields are currently under way in the Statistical Engineering Division. Below, the panel describes several ongoing activities, but these efforts are just a few examples of the division’s many successful projects.

The project highlights discussed in this section fall into three categories: Bayesian methodology, uncertainty analysis for key comparisons, and uncertainty analysis for process measurements. In the first area, the division has made major contributions through its work on Bayesian metrology. A fundamental problem in metrology is the assessment and assignment of realistic uncertainty to measurement results. In many complex problems, such as the analysis of high-throughput measurements, high-dimensional data, and complex dynamical systems, it is important to combine expert knowledge and prior information with physical measurements. The researchers in the Statistical Engineering Division
have adopted the Bayesian framework to solve these sorts of problems. This framework provides a scientific basis and formal approach to utilizing scientific knowledge and prior information to yield better design of experiments and testing strategies. The application of this approach has produced a number of key achievements in the past year in the areas of interlaboratory intercomparisons, international key comparisons among national measurement institutes (NMIs), elicitation of prior information to calculate uncertainties, and development of nonparametric Bayesian models using empirical distributions.
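
As a toy illustration of the kind of combination involved (and not a description of the division’s actual models), consider a single measurand with a normal prior elicited from expert knowledge and a set of normally distributed measurements. The conjugate update sketched below, with invented numbers, shows how the prior and the data jointly determine the reported value and its standard uncertainty.

# Toy example: conjugate normal-normal Bayesian update for a single
# measurand. Prior N(mu0, tau0^2) encodes expert knowledge; measurements
# are modeled as i.i.d. N(theta, sigma^2) with known sigma.
import numpy as np

def posterior(mu0, tau0, y, sigma):
    n = len(y)
    prec = 1.0 / tau0**2 + n / sigma**2                  # posterior precision
    mean = (mu0 / tau0**2 + np.sum(y) / sigma**2) / prec
    return mean, np.sqrt(1.0 / prec)                     # mean, std. uncertainty

y = np.array([10.12, 10.09, 10.15, 10.11])               # illustrative data
mean, sd = posterior(mu0=10.0, tau0=0.2, y=y, sigma=0.05)
print(f"posterior mean = {mean:.4f}, standard uncertainty = {sd:.4f}")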

One important application of Bayesian methodology is the international study of the sublethal effects of fire smoke on survivability and health, a joint project between the division and the NIST Building and Fire Research Laboratory. The goal was to obtain consensus values and uncertainty measures for the lethal and incapacitating toxic potencies of a large number of building materials, based on data from the many different studies that have been published. The challenge was that the quality of the data varied greatly from study to study. Statistical Engineering Division researchers developed a Bayesian hierarchical model that combines data from studies both with and without uncertainty measures by constructing vague priors at the lowest level of the hierarchy. The results of this project have had a large impact on the building industry at both the national and international levels, and the success of the method attracted a great deal of interest from other researchers at NIST.

In the area of key comparisons and uncertainty analysis, the division has taken a lead in international efforts to establish equivalence among the many national standards organizations throughout the world. The mechanism for these efforts is the Mutual Recognition Agreement (MRA) among the NMIs and regional metrology organizations that belong to the International Committee for Weights and Measures (CIPM). In this work, the greatest challenge for the division has been to develop a set of sound statistical design and analysis procedures to be used in interlaboratory studies for establishing the equivalence of national standards. Key comparisons have five critical phases: (1) agreement among NMI scientists on the specific transfer standard (and/or measurement process), (2) design of the multinational experiment, (3) data collection at each NMI, (4) determination of the reference value and assessment of the standard uncertainty at each NMI, and (5) determination and reporting of the level of equivalence among the participating NMIs and the related uncertainties. In the past year, division statisticians have developed a unified approach to experimental design and analysis to be applied in the work on key comparisons. Facilitation of key comparisons is an important element of NIST’s support of the United States in the ongoing trend toward open markets and globalization.
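
A common textbook treatment of step 5 (not necessarily the division’s unified procedure) takes an inverse-variance weighted mean of the laboratories’ results as the key comparison reference value and reports each laboratory’s degree of equivalence as its deviation from that value. The sketch below uses invented numbers and the standard result that, for a laboratory included in the weighted mean, the variance of its deviation is its own variance minus that of the reference value.

# Illustrative key-comparison analysis: weighted-mean reference value
# (KCRV) and degrees of equivalence. A generic procedure with made-up
# data, not the division's specific method.
import numpy as np

x = np.array([100.02, 99.98, 100.05, 100.01])   # lab results
u = np.array([0.03, 0.02, 0.05, 0.04])          # standard uncertainties

w = 1.0 / u**2
kcrv = np.sum(w * x) / np.sum(w)                # weighted-mean reference value
u_kcrv = np.sqrt(1.0 / np.sum(w))

d = x - kcrv                                    # degrees of equivalence
u_d = np.sqrt(u**2 - u_kcrv**2)                 # labs contributing to the KCRV

for i, (di, udi) in enumerate(zip(d, u_d)):
    print(f"lab {i}: d = {di:+.4f}, U(d) = {2 * udi:.4f} (k=2)")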

The value of the division’s expertise in data comparison can be seen even in comparisons that predate the MRA. Data comparing laboratories’ realizations of the International Temperature Scale were collected over several years by 15 laboratories around the world before the MRA was signed, so no information was available as to whether the submitted uncertainty components were for an individual measurement or for the mean of replicated measurements. Other problems involved determining how each uncertainty component contributed to the measurement error of the process, what uncertainties were associated with the standard platinum resistance thermometers used as transfer instruments, how to compute coverage factors appropriately so that expanded uncertainties carry the correct confidence levels, and, finally, how to explain the effects arising from the diverse paths used to compute temperature differences across subsets of laboratories. Working jointly with the NIST Chemical Science and Technology Laboratory (CSTL) and two laboratories in Germany and Australia, the Statistical Engineering Division was able to overcome all of these challenges and produce useful results from the comparison data. These results are having a significant impact on international temperature standards and on sales of temperature-related equipment and services between countries. The success of this comparison makes it a model for future comparisons.
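
On the coverage-factor question in particular, the standard GUM recipe (a generic procedure, not the specific analysis used for this comparison) combines the uncertainty components in quadrature, computes effective degrees of freedom with the Welch-Satterthwaite formula, and takes the coverage factor from a t distribution. A minimal sketch with invented components follows.

# Generic GUM-style expanded uncertainty: combine components, compute
# Welch-Satterthwaite effective degrees of freedom, and use a t-based
# coverage factor for approximately 95 percent coverage.
import numpy as np
from scipy import stats

u = np.array([0.010, 0.006, 0.004])    # uncertainty components (made up)
nu = np.array([9, 4, 50])              # their degrees of freedom

u_c = np.sqrt(np.sum(u**2))                        # combined standard uncertainty
nu_eff = u_c**4 / np.sum(u**4 / nu)                # Welch-Satterthwaite formula
k = stats.t.ppf(0.975, df=nu_eff)                  # coverage factor
print(f"u_c = {u_c:.4f}, nu_eff = {nu_eff:.1f}, k = {k:.2f}, U = {k * u_c:.4f}")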

Another project that is a benchmark example of what is possible in a key comparison is the division’s work on statistical uncertainty analysis for the comparison of resistance standards. The NIST Electronics and Electrical Engineering Laboratory (EEEL) and 14 other NMIs participated in the CIPM Consultative Committee for Electricity and Magnetism’s Comparisons of Resistance Standards. Two types of dc resistors were used as traveling standards and were measured by different laboratories at different times. Systematic drifts of the traveling standards and laboratory measurement uncertainties were the main causes of discrepancies among the measurements. However, the Statistical Engineering Division staff developed an accurate statistical model based on linear regression to combine the measurements, and this model can now provide a basis for calibrating the high-resistance standards of the laboratories’ customers.
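
A much-simplified version of such a model (with invented data, and far cruder than the division’s analysis) fits a common linear drift to the traveling standard’s value over time along with a constant offset for each laboratory relative to the pilot. Ordinary least squares on a design matrix with a time column and laboratory indicator columns is enough to show the idea.

# Toy drift model for a traveling standard: value = intercept
# + drift * time + offset(lab), fitted by ordinary least squares.
# Data and model are illustrative only.
import numpy as np

t   = np.array([0, 10, 20, 30, 40, 50], dtype=float)   # days since start
lab = np.array([0, 1, 1, 2, 2, 0])                      # lab 0 = pilot
y   = np.array([1000.002, 1000.010, 1000.012,
                1000.020, 1000.023, 1000.018])          # measured values

# Design matrix: [1, t, I(lab==1), I(lab==2)]; lab 0 is the baseline.
X = np.column_stack([np.ones_like(t), t,
                     (lab == 1).astype(float), (lab == 2).astype(float)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("drift per day:", beta[1])
print("offsets of labs 1 and 2 relative to the pilot:", beta[2], beta[3])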

Another important contribution of the Statistical Engineering Division is the study of uncertainties associated with process measurements. Data of this kind, such as fluid-flow measurements, high-speed optoelectronic signals, and spray characteristics, arise frequently in the NIST laboratories. In the past year, division staff have worked collaboratively with other NIST scientists to solve difficult problems, drawing on expertise in statistical signal processing, time-series analysis, and statistical smoothing techniques.

One such joint project, with the CSTL, concerns flow measurements for multimeter transfer standards. The result of this work is an in-house prototype system for evaluating in detail the behavior of dual-meter systems. With the understanding gained from this prototype, an efficient experimental design has been tested and then tailored to the international key comparison setting. In conjunction with the development of the experimental design, a data analysis methodology has been put into place for the initial key comparison, for which NIST is the pilot laboratory. The time spent on this project was highly leveraged, as the new protocol is serving as the prototype for the other five areas under study in the CIPM’s Working Group for Fluid Flow and as the basis for all future international flow comparisons.

The importance of the division’s expertise and experience in statistical issues related to signal processing can be seen in the work with EEEL on high-speed optoelectronic measurements. Division staff have developed state-of-the-art statistical signal processing techniques to reduce the random component of the timing error and the systematic component of time-base distortion in these measurements. Using a regression spline model, the average of the aligned signals is interpolated onto an equally spaced time grid based on the estimated time-base distortion, and the resulting power spectrum is then corrected for jitter effects by an estimated multiplier. The laboratory’s new measurement capability will be used to support industrial applications in the areas of Gigabit Ethernet networks, Fibre Channel, optical telecommunications, and wireless communications. The results have been published in IEEE Transactions on Instrumentation and Measurement.
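
To make the processing chain concrete, the sketch below is a rough approximation: it uses invented data, a generic smoothing spline in place of the laboratory’s regression-spline model, and the standard multiplier exp((2*pi*f*sigma)^2) for Gaussian timing jitter with an assumed jitter value. It interpolates an averaged waveform onto a uniform time grid and then rescales its power spectrum.

# Sketch: place an averaged, irregularly sampled waveform on a uniform
# time grid with a smoothing spline, then correct its power spectrum
# for Gaussian timing jitter. Data, spline choice, and jitter value are
# illustrative only. Time is in nanoseconds, so frequency is in GHz.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
t_irreg = np.sort(rng.uniform(0.0, 1.0, 400))            # irregular sample times, ns
sig = np.sin(2 * np.pi * 5.0 * t_irreg) + 0.02 * rng.standard_normal(400)  # 5 GHz tone

t_grid = np.linspace(0.0, 1.0, 512)                       # equally spaced grid
sig_grid = UnivariateSpline(t_irreg, sig, s=0.2)(t_grid)  # smoothed interpolation

dt = t_grid[1] - t_grid[0]
freqs = np.fft.rfftfreq(len(t_grid), d=dt)                # GHz
power = np.abs(np.fft.rfft(sig_grid))**2
sigma_jitter = 0.0005                                     # assumed 0.5 ps jitter, in ns
power_corrected = power * np.exp((2 * np.pi * freqs * sigma_jitter)**2)
print("jitter correction factor at 50 GHz:",
      np.exp((2 * np.pi * 50.0 * sigma_jitter)**2))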

Program Relevance and Effectiveness

As demonstrated by the preceding examples, the efforts of the Statistical Engineering Division have a broad impact on the work of the NIST scientists and engineers with whom the division collaborates. Since NIST’s mission and activities focus on measurement science, the division staff have developed some capabilities that are unique within the scientific community. These capabilities include expertise and experience in techniques for statistical uncertainty analysis for measurement science (i.e., measurement processes) and statistical methods for metrology. These are important capabilities, and the panel firmly believes that the division can and should play a pivotal role in NIST’s support of U.S. industry by promoting industrial statistics and by helping to link key statistical groups in academia and industry and
at U.S. national laboratories. For the past few years, the division has appropriately focused on rebuilding its reputation and maintaining a strong portfolio of work within NIST (and ITL), but now that the division is growing stronger, increased focus on external relationships and responsibilities should be the natural next step.

Technology transfer methods are one area in which external organizations might benefit from learning more about the Statistical Engineering Division’s projects and approaches. The division staff have an excellent reputation for turning research projects into standard methods, tools, and software that NIST scientists can use easily. This is itself a form of technology transfer, and these products are particularly valuable because they continue to raise the statistical competency of NIST scientists while avoiding a pitfall common in other organizations: scientists becoming dependent on statisticians for experimental analysis, which limits the time statisticians have to push the leading edge of statistical research and development. Not only are NIST statisticians adept at technology transfer within NIST, but division staff are also actively involved in general education about statistical uncertainty and promote statistical methodologies by offering short courses and workshops and by producing the Web-based NIST/SEMATECH Engineering Statistics Internet Handbook.13 Other statistics organizations in industry and at the national laboratories struggle with technology transfer and could learn a great deal from the division’s successes in this area.

The Statistical Engineering Division plays an important role in the national and international metrology communities through contributions to documents and handbooks of standard methodology, promotion of statistical approaches to metrology, collaboration on international experiments, participation in international metrology organizations, publication in leading metrology journals, and assistance to other NMIs through training and collaborations. The value of the division’s work is known and appreciated throughout the relevant physical sciences communities as a result of the dissemination efforts mentioned above and the large number of publications that result from the division’s collaborations and appear in subject-matter scientific journals.

However, the statistical sciences community should also be benefiting from the division’s unique expertise and experience. Division staff do publish in statistics journals and present at statistics conferences, and the panel was pleased to see a surge in dissemination of the division’s research in peer-reviewed journals, as suggested in last year’s assessment report. Nonetheless, the burden is still on the division to build stronger interactions with the statistical sciences community as a whole. To achieve higher visibility in the statistics discipline and maximize its impact on the statistics community, the division should generalize the methodologies developed for specific problems, demonstrate their relevance to a wider range of similar problems, and make the results broadly available through publications in leading statistical journals.

Division Resources

Funding sources for the Statistical Engineering Division are shown in Table 8.9. As of January 2002, staffing for the division included 19 full-time permanent positions, of which 17 were for technical professionals. There were also 12 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

13  

The NIST/SEMATECH Engineering Statistics Handbook is available online at <http://www.itl.nist.gov/div898/handbook/index.html>.

TABLE 8.9 Sources of Funding for the Statistical Engineering Division (in millions of dollars), FY 1999 to FY 2002

Source of Funding                        FY 1999    FY 2000    FY 2001    FY 2002
                                         (actual)   (actual)   (actual)   (estimated)
NIST-STRS, excluding Competence             2.9        3.0        3.4        3.7
Competence                                  0.5        0.6        0.3        0.5
STRS—supercomputing                         0.1        0.0        0.0        0.0
ATP                                         0.0        0.0        0.2        0.3
Measurement Services (SRM production)       0.0        0.0        0.1        0.5
OA/NFG/CRADA                                0.1        0.1        0.1        0.4
Total                                       3.6        3.7        4.1        5.4
Full-time permanent staff (total)a           23         19        17a         19

NOTE: Sources of funding are as described in the note accompanying Table 8.1.

aThe number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March.

As noted above, the progress evident since the hiring of the new division chief in the fall of 2000 is extraordinary. Morale is now high, top-rate visiting researchers have been engaged to fill technical gaps, and new junior personnel have been hired (in spite of difficulties caused by the governmentwide hiring freeze imposed by the change in administration). It is clear that the division is moving quickly in exactly the right direction. However, the division’s recovery is still at a somewhat fragile stage, and continued support from ITL and NIST management will be needed to fulfill the division’s long-term goals. A 5-year plan is in place to increase the division’s full-time permanent technical staff from 16 to 24, and the division is on track in year 2 of the plan. The growth should be a combination of junior and senior hires, and, to continue to move forward, the division will require a commitment from management to steadily increase its core support.

The Statistical Engineering Division has done well in competitions for internal research and development funds, and this year has seen a significant increase in the division’s activities related to the production of standard reference materials (SRMs). This growth is due primarily to the staff’s outstanding development of educational courses for outreach to NIST researchers, who have in turn realized the opportunities presented by working with the division on SRM-related projects. The panel believes that the staff should aggressively continue these outreach activities, which produced the surge in national and international SRM work.

The panel continues to be concerned about the relative isolation of the Statistical Engineering Division in its current location at NIST North. The issues related to this recurring concern have been discussed at length in many past assessment reports. If NIST is to obtain the maximum value possible from the division, the panel strongly urges NIST management to consider relocating this division to the main campus. If relocation is not an option in the near term, NIST management should actively work with ITL and division management on other creative approaches to solving this problem.
