8
Information Technology Laboratory

PANEL MEMBERS

Tony Scott, General Motors Corporation, Chair
Albert M. Erisman, Institute for Business, Technology, and Ethics, Vice Chair
Michael Angelo, Compaq Computer Corporation
Bishnu S. Atal, AT&T Laboratories-Research
Matt Bishop, University of California, Davis
Linda Branagan, Secondlook Consulting
Jack Brassil, Hewlett-Packard Laboratories
Aninda DasGupta, Philips Consumer Electronics
Susan T. Dumais, Microsoft Research
John R. Gilbert, Xerox Palo Alto Research Center
Roscoe C. Giles, Boston University
Sallie Keller-McNulty, Los Alamos National Laboratory
Stephen T. Kent, BBN Technologies
Jon R. Kettenring, Telcordia Technologies
Lawrence O’Gorman, Avaya Labs
David R. Oran, Cisco Systems
Craig Partridge, BBN Technologies
Debra J. Richardson, University of California, Irvine
William Smith, Sun Microsystems
Don X. Sun, Bell Laboratories/Lucent Technologies
Daniel A. Updegrove, University of Texas, Austin
Stephen A. Vavasis, Cornell University
Paul H. von Autenried, Bristol-Myers Squibb
Mary Ellen Zurko, IBM Software Group

Submitted for the panel by its Chair, Tony Scott, and its Vice Chair, Albert M. Erisman, this assessment of the fiscal year 2002 activities of the Information Technology Laboratory is based on a site visit by the panel on February 26-27, 2002, in Gaithersburg, Md., and documents provided by the laboratory.1

1 U.S. Department of Commerce, Technology Administration, National Institute of Standards and Technology, Information Technology Laboratory Technical Accomplishments 2001, NISTIR 6815, National Institute of Standards and Technology, Gaithersburg, Md., November 2001; U.S. Department of Commerce, Technology Administration, National Institute of Standards and Technology, Report to the ITL Assessment Panel, National Institute of Standards and Technology, Gaithersburg, Md., February 2002; U.S. Department of Commerce, Technology Administration, National Institute of Standards and Technology, Information Technology Laboratory Publications 2001, National Institute of Standards and Technology, Gaithersburg, Md., February 2002.

LABORATORY-LEVEL REVIEW

Technical Merit

The mission of the Information Technology Laboratory (ITL) is to strengthen the U.S. economy and improve the quality of life by working with industry to develop and apply technology, measurements, and standards. This mission is very broad, and the programs not only encompass technical and standards-related activities but also provide internal consulting services in mathematical and statistical techniques and computing support throughout NIST.2 To carry out this mission, the laboratory is organized in eight divisions (see Figure 8.1): Mathematical and Computational Sciences, Advanced Networking Technologies, Computer Security, Information Access, Convergent Information Systems, Information Services and Computing, Software Diagnostics and Conformance Testing, and Statistical Engineering. The activities of these units are commented on at length in the divisional reviews in this chapter. Below, some highlights and overarching issues are discussed.

The technical merit of the work in ITL remains strong. As part of its on-site reviews, the panel had the opportunity to visit each of the divisions for a variety of presentations and reviews related to the projects currently under way. While it is not possible to review every project in the greatest detail, the panel has been consistently impressed with the technical quality of the work undertaken. The panel also particularly applauds ITL staff’s willingness to take on difficult technical challenges while demonstrating an appropriate awareness of the context in which NIST results will be used and the importance of providing data and products that are not just correct and useful but also timely. Many examples of programs with especially strong technical merit are highlighted in the divisional reviews.

The panel is very pleased to see the progress that has occurred in strategic planning in ITL. A significant development over the past year has been the emergence and acceptance of a framework under which the laboratory activities operate. The framework includes the ITL Research Blueprint and the ITL Program/Project Selection Process and Criteria. The panel observed that these descriptions and tools appear to be well institutionalized within each of the divisions and seem to be having a positive initial impact on improving the direction and efficacy of laboratory projects and programs. These frameworks were widely used in the presentations made to the panel, and the panel noted the emergence of a common “vocabulary” with respect to planning and strategy. Increased collaborations between divisions were also observed. The panel also continues to see progress in the divisions on rational, well-justified decisions about what projects to start and conclude and when to do so.

Program Relevance and Effectiveness

ITL has a very broad range of customers, from industry and government and from within NIST, and the panel found that the laboratory serves all of these groups with distinction. In addition to the panel’s expert opinion, many quantitative measures confirm the relevance and effectiveness of ITL’s programs. One is the level of interaction between laboratory staff and their customers, which continues to rise.
Attendance is up at ITL-led and -sponsored seminars, workshops, and meetings; staff participation in standards organizations and consortia is strong; and laboratory staff have robust relationships with researchers and users from companies, governmental agencies, and universities.

2 In February 2002, NIST management announced that the computing services functions currently housed in ITL will be moved into a separate unit, headed by a chief information officer (CIO) who will report directly to the NIST director. This transition is discussed in the “Program Relevance and Effectiveness” subsection, below.

FIGURE 8.1 Organizational structure of the Information Technology Laboratory. Listed under each division are the division’s groups.

Another visible measure of the quality and relevance of ITL’s work is the number of awards that laboratory staff receive from NIST, the Department of Commerce, and external sources. Examples include a Department of Commerce Gold Medal and an RSA Public Policy Award for the work on the Advanced Encryption Standard, an R&D 100 Award for the development of the Braille Reader, a series of awards from the National Committee for Information Technology Standards for leadership in standards-related activities such as the work on standards for geographic information systems, and the election of a staff member as a fellow of the American Society for Quality because of his work on

An Assessment of the National Institute of Standards and Technology Measurement and Standards Laboratories: Fiscal Year 2002 applying statistics to measurement sciences. These honors, spread across the various divisions, recognize outstanding technical and program achievement at numerous levels. ITL’s interactions with and impact on industrial customers continue to improve each year, and the panel applauds the laboratory’s ability to produce and disseminate results of value to a broad audience. ITL primarily serves two kinds of industrial customers: computer companies (i.e., makers of hardware and software) and the users of their products (which include companies from all sectors, government, and, to some extent, the public). The divisional reviews later in this chapter contain many examples of how ITL makes a difference. Notable cases include the Advanced Networking Technologies Division’s success at raising the visibility of co-interference problems between IEEE 802.11 and Bluetooth wireless networks and NIST’s technical contributions to evaluating possible solutions; the Convergent Information Systems Division’s development of an application that can preview how compressed video appears on different displays, thus allowing producers to make decisions about the amount of compression in light of the equipment likely to be used by the target audience; and the Software Diagnostics and Conformance Testing Division’s facilitation of the development of an open standard and needed conformance tests for extensible markup language (XML). In addition to serving all of these customers, ITL projects also have been known to have an impact worldwide. For example, standards developed with NIST’s help and leadership in the security, multimedia, and biometrics areas are all used throughout the relevant international technical communities. In last year’s assessment report,3 the panel expressed concerns about industry trends in standards development that would affect ITL’s ability to effectively and openly help industry adopt the most appropriate standards for emerging technologies. The growing use of consortia and other private groups in standards development processes places a burden on ITL, which has to strike a balance between its obligation to support and encourage open processes and its need to be involved as early as possible in standards-setting activities so as to maximize the impact of ITL’s experience and tools. In some cases, a delicate trade-off must be made between participating in a timely way in organizations that will set standards for the industry and avoiding endorsement of standards set by exclusive groups. ITL’s role as a neutral third party and its reputation as an unbiased provider of technical data and tools have produced significant impact in many areas and should not be squandered by association with organizations that unreasonably restrict membership. The panel continues to urge ITL to establish a policy to help divisions decide when participation in closed consortia is appropriate and to consider how NIST can encourage industry to utilize open, or at least inclusionary, approaches to standards development. 
Given that consortia, in some form or another, are here to stay and that in some cases it will be vital for NIST to participate in these consortia, the panel supports the efforts recently made by ITL and NIST to work on the internal legal roadblocks to participation, but it suggests that this work could be supplemented by efforts to educate external groups, such as consortia members and lawyers, on ways to facilitate NIST’s timely participation and technical input. This is a customer outreach effort as well as a legal issue. One customer that relies significantly on ITL’s products and expertise is the federal government, which often uses NIST standards and evaluation tools to guide its purchase and use of information technology (IT) products, particularly in the computer security area. An example is the Computer Security Division’s Cryptographic Module Validation Program (CMVP), which has enabled purchasers, 3   National Research Council, An Assessment of the National Institute of Standards and Technology Measurement and Standards Laboratories: Fiscal Year 2001, National Academy Press, Washington, D.C., 2001.

An Assessment of the National Institute of Standards and Technology Measurement and Standards Laboratories: Fiscal Year 2002 including the U.S. government, to be sure that the security attributes of the products that they buy are as advertised and appropriate. In the Information Access Division, the new Common Industry Format (CIF) standard provides a foundation for exchanging information on the relative usability of products and is already being used for procurement decisions by several large enterprises. Another key ITL activity relevant to the federal government is the work on fingerprint and face recognition. NIST standards and data have played a key role in the development of automated fingerprint identification systems. Also, since the attacks of September 11, 2001, there has been significant pressure to increase the reliability of biometric recognition technologies, especially face recognition. ITL’s existing, long-term programs and expertise in face, fingerprint, and gait biometrics will provide test data that will help drive system development and help government evaluation of systems capabilities. Programs such as the work on biometrics, especially face recognition, highlight a question relevant to many information technology activities: that is, in what context will technological advances be used? Information technology is often an enabling technology that will produce new capabilities with expected and unexpected benefits and costs.4 The panel acknowledges that ITL’s primary focus is on technical questions and technical quality, but it emphasizes that for the laboratory’s work to be responsible and for the results to be taken seriously in the relevant communities, recognition of the context in which new technologies will be applied is very important. This context has two elements: the deployment of the technology and the social implications of the technology. In the first area, the deployment questions relate to the functionality of the systems in which new technical capabilities will be used. A testbed is not necessarily meant to determine the “best” technology but rather the one that works well enough to meet the needs for which it is being developed. Often, the process of considering the possible applications of a technology results in a broader appreciation of the potential benefits. For example, appropriate security is actually an enabler that allows e-business, the globalization of work, collaboration across geography, and so on. Understanding the ultimate goals for new technologies relates to the social implications questions. For example, security has serious implications for privacy. The panel emphasizes that in many of the ongoing programs—such as the work on the potential use of face recognition technologies as security systems in public places—ITL staff made long and arduous efforts to comply with existing privacy legislation. However, when describing the NIST results to public groups (such as the panel), staff should also be sure to take the time to acknowledge the privacy questions and describe potential future issues, as well as discussing the capabilities and benefits of the technological advancements. Following are two examples of areas in which the panel believes that the potential societal issues or the actual context in which technologies would be used were not being fully considered. 
The first example is the suggestion that a commercial application for face recognition could be that of having an automated teller machine (ATM) recognize a user with Hispanic features and automatically switch to using Spanish. As many people of Hispanic (or Swedish or Asian) appearance are not in fact speakers of the “native” language implied by their looks, this is a naïve (and perhaps inappropriate) example of the technology’s potential. The second example is in the area of pervasive computing, where NIST’s work on “smart” meeting facilities was demonstrated for the panel. Recording meetings for search and archiving can offer significant benefits in some contexts, but it can also inhibit certain types of discussions. For example, the effectiveness of brainstorming sessions or examinations of “what if” scenarios 4   How the social context can provide a framework for information technology development is discussed at length in the following report: Computer Science and Telecommunications Board, National Research Council, Making IT Better: Expanding Information Technology Research to Meet Society’s Needs, National Academy Press, Washington, D.C., 2000.

An Assessment of the National Institute of Standards and Technology Measurement and Standards Laboratories: Fiscal Year 2002 might be significantly limited if the participants thought the discussion might later be taken out of context and broadcast. In addition to strong relationships with customers in industry and in the federal government, ITL places significant emphasis on effectively serving its customers within NIST. The panel commends the focus in both the Mathematical and Computational Sciences Division and the Statistical Engineering Division on building robust collaborations with scientists and engineers throughout ITL and the other NIST laboratories. One example is the work of the Mathematical and Computational Sciences Division on the mathematical modeling of solidification with staff from the Materials Science and Engineering Laboratory; another is the Statistical Engineering Division’s development of a method to combine data from diverse building materials studies for the Building and Fire Research Laboratory. A primary current responsibility of ITL is that of IT support for all of NIST. The relevant activities—which include the support and maintenance of campus networking, personal computers (PCs), administrative applications (such as accounting software), and telephones—are performed by the Information Services and Computing Division. These service programs were unified in this division in December 2000, and the panel is very pleased at the significant progress observed in the past 2 years. The quality and effectiveness of the support functions have improved and so has the overall planning and strategic approach to providing the relevant services. A “NIST IT Architecture” has been developed, and it should help provide context and scope for each of the subarchitectures and various support functions at NIST. Other recent accomplishments include the formation and centralization of a NIST-wide help desk and increased standardization around core processes such as PC procurement. Issues do still exist, however, including a lack of ability for this division’s staff to enforce or even check compliance with centralized IT standards or policies. For example, many units at NIST do their own systems administration, which could result in uneven implementation of appropriate security applications. The key issue for IT services at NIST in the next year will be an organizational transition. In February 2002, NIST management announced that the support functions currently housed in ITL will be moved out of the laboratory into a separate unit, headed by a chief information officer (CIO) who will report directly to the NIST director. Since a significant problem for the current unit is the difficulty in getting the NIST laboratories to embrace consistent, institutionwide standards for IT systems, raising the services unit to a level equivalent with the laboratories may provide needed visibility for the issue. Another factor that may help is the emphasis by the current director of this new unit (the acting CIO) on demonstrating to the other NIST laboratories how IT services can facilitate their research and how standardizing basic applications can save time and money. Achieving acceptance of this new unit and centralized IT support across NIST will be a serious leadership challenge, as this approach will be a cultural shift for NIST. The panel encourages benchmarking with organizations such as Agilent Technologies that have successfully made such a transition. 
Making the IT services component of NIST a separate unit rather than a division of ITL may bring it closer to other laboratories; however, it is important that this unit maintain close ties with ITL programs. For example, some of the work being done in the Computer Security Division can and should be applied to the security of the NIST system. Work on technologies for meetings can be tested and effectively used throughout NIST. Applying the development work of ITL’s research divisions to NIST as a whole will require the continued tracking in the services unit of relevant ongoing projects and the recognition in ITL of the potential for using NIST as a whole as a testbed. ITL has done a remarkable job of becoming more customer-oriented over the past several years. The panel applauds the laboratory’s efforts in outreach and notes that the progress reflects improvement in a whole range of areas—for example, gathering wider and more useful input, helping with project selection, and increased dissemination and planning for how customers will utilize NIST results and

products. ITL has supported this increased focus on its customers by measuring outputs and outcomes that provide data on how the laboratory is doing in this area. (One example is that of tracking the number of times ITL-developed standards and technology are adopted by government and industry.)

Now that ITL is serving its customers so well, the panel wants to suggest that some attention could also be paid to strengthening the laboratory’s reputation and stature with its colleagues in relevant research communities. Customers are uniquely positioned to assess the timeliness of and need for ITL results, but ITL’s peers can and should assess the technical excellence of the laboratory’s work. A variety of reasons support having input from both groups, that is, having a balanced scorecard for the laboratory’s portfolio. One reason is that sometimes customer satisfaction is not the right metric, since NIST can, and in some cases should, hold companies to higher standards than the companies might wish. Another reason is that elevating the stature of ITL researchers in their peer communities can raise NIST’s credibility with its customers. Therefore, in the future the panel hopes to see increased emphasis on ITL’s visibility within relevant research communities.

Like ITL’s successful efforts to improve customer relationships, increased visibility can be driven by the use of appropriate metrics. It is not entirely clear what outputs or events will effectively measure ITL’s work in this area. Possibilities include but are not limited to the number of times that staff are named as nationally recognized fellows of professional organizations (such as IEEE, the Association for Computing Machinery [ACM], the American Physical Society, and the American Society for Quality), the number of times ITL staff are featured speakers at high-profile conferences, and the number of staff publications in top-tier peer-reviewed IT journals. The metrics will obviously depend on the field in which ITL’s research is occurring. The panel acknowledges that it is often inappropriate to compare NIST researchers directly with people working in industry research units or at universities, because ITL’s role of producing test methods, test data, standards, and so on is different from industrial or academic activities and is often unique. However, ITL’s peers at these other institutions are still in a position to recognize and evaluate the technical merit and quality of the NIST programs. The panel is not suggesting that recognition by external peer communities should replace responsiveness to customer needs as a primary focus, but it is instead suggesting that ITL perform the difficult balancing act of putting more emphasis on publication and interaction in the relevant research community without losing its focus on its customers.

Laboratory Resources

Funding sources for the Information Technology Laboratory are shown in Table 8.1. As of January 2002, staffing for the laboratory included 389 full-time permanent positions, of which 319 were for technical professionals. There were also 105 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

The panel’s primary concern in the area of human resources is the April 2002 retirement of the current director of ITL.
The panel has observed and laboratory staff have explicitly stated that morale is at an all-time high in ITL, due in large part to the director’s leadership style and direction. A great deal of concern has surfaced among the staff over the process for filling the director’s slot, how long it will take, and what the caliber and style of the next director will be. The panel recommends that NIST leadership focus on providing clear communication to staff about the selection criteria and frequent updates as to the progress of the search and hiring process. Sharing relevant information will certainly help the transition proceed more smoothly.

One facilities issue highlighted in last year’s assessment report was the location of five divisions in NIST North. The existence and use of NIST North is a perennial issue. The panel recognizes that the quality of the space in NIST North is significantly better than what would be available on campus; however, access to these improved facilities does not compensate for the distance from the rest of the campus for two of the ITL divisions—the Mathematical and Computational Sciences and the Statistical Engineering Divisions. The distance inhibits informal interactions of the staff of these two divisions with their collaborators in the other laboratories on the main campus. Thus, ITL management has submitted the space requirements of these divisions to NIST management, which will be making revised space allocation decisions related to the new Advanced Measurement Laboratory (AML), due to be completed in 2004. The panel encourages NIST management to make a serious effort to move these two divisions back to the main campus.5

5 One group in the Mathematical and Computational Sciences Division, the Scientific Applications and Visualization Group, is already located on the main campus.

TABLE 8.1 Sources of Funding for the Information Technology Laboratory (in millions of dollars), FY 1999 to FY 2002

Source of Funding                        FY 1999 (actual)   FY 2000 (actual)   FY 2001 (actual)   FY 2002 (estimated)
NIST-STRS, excluding Competence                31.6               31.9               44.4               38.8
Competence                                      1.5                1.6                1.1                1.3
STRS—Supercomputing                            12.1               12.0               11.9               10.0
ATP                                             1.8                2.4                2.3                2.0
Measurement Services (SRM production)           0.0                0.0                0.1                0.5
OA/NFG/CRADA                                    8.4                9.9               12.2               14.6
Other Reimbursable                              0.5                1.6                1.0                0.3
Agency Overhead                                14.4               16.4               18.4               28.2
Total                                          70.3               75.8               91.4               95.7
Full-time permanent staff (total)a              381                381               368a                389

NOTE: Funding for the NIST Measurement and Standards Laboratories comes from a variety of sources. The laboratories receive appropriations from Congress, known as Scientific and Technical Research and Services (STRS) funding. Competence funding also comes from NIST’s congressional appropriations but is allocated by the NIST director’s office in multiyear grants for projects that advance NIST’s capabilities in new and emerging areas of measurement science. Advanced Technology Program (ATP) funding reflects support from NIST’s ATP for work done at the NIST laboratories in collaboration with or in support of ATP projects. Funding to support production of Standard Reference Materials (SRMs) is tied to the use of such products and is classified as “Measurement Services.” NIST laboratories also receive funding through grants or contracts from other government agencies (OA), from nonfederal government (NFG) agencies, and from industry in the form of cooperative research and development agreements (CRADAs). All other laboratory funding, including that for Calibration Services, is grouped under “Other Reimbursable.”
a The number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March (due to a reorganization of ITL that year).

However, 2004 is still several years away. In the meantime, the panel continues to note that a mix of systems taking into account technological and social factors could help compensate for the
An Assessment of the National Institute of Standards and Technology Measurement and Standards Laboratories: Fiscal Year 2002 distance. Tools such as videoconferencing, Web collaboration packages, and Web broadcasting can support nonphysical interactions, but regular, scheduled (and subsidized) opportunities for face-to-face meetings are necessary to make these technical solutions most effective. These approaches are applicable to the NIST North/main campus gap, as well as to the Gaithersburg/Boulder divide. A second facilities issue raised in the 2001 assessment report was the poor network connectivity of NIST to the outside world. The panel was very pleased to learn that since the last review, NIST has joined the Internet 2 project, thus dramatically improving the connectivity and placing NIST on a par with the major universities and industrial research organizations that participate in this project. The next step will be educating researchers in the other laboratories at NIST about how to take full advantage of this new capability. The panel met with staff in “skip-level” meetings (sessions in which management personnel were not present). The key message from these meetings was that in the past few years, under the current management of the laboratory, ITL has become an especially enjoyable place to work, noted for such attributes as respect for the individual, stability, an appropriate level of flexibility, and focus on visible results. The panel also observed this high level of morale in visits to individual divisions. Turnover in ITL was approximately 9 percent this year, down slightly from last year. Although turnover has decreased in industry in the past year and is now about comparable to that in ITL, over the last several years ITL has had a remarkably low comparative turnover rate for an IT organization. The panel applauds laboratory and division management for creating such a positive work environment. Some issues were brought up in the skip-level meetings. The panel cannot judge if these concerns are broad-based or isolated but does note that perhaps laboratory management should be aware of them. For example, ITL staff said that while relationships with the other NIST laboratories had improved, they still felt that ITL did not have the same status or prestige that other laboratories enjoy. The panel notes that continued interaction with staff in other laboratories, internal and external recognition of staff, and cross-laboratory projects will help ameliorate imbalances or perceptions of “second-class” status. The shift of IT support services to a separate unit also might help emphasize to the rest of NIST that the core mission of ITL is the same as that of the rest of the laboratories. Other concerns expressed by staff included perceived inconsistencies in performance measurement and some related frustrations about apparently unequal burdens of work owing to the difficult process for firing poor performers in the federal system. Such perceptions, if they indeed exist on a broader scale in ITL, would not be unique to ITL, NIST, government agencies, or even businesses in general. Laboratory Responsiveness The panel found that, in general, ITL has been very responsive to its prior recommendations and observations. The panel’s comments appear to be taken very seriously, and the suggestions made in the assessment reports are often acted on, especially as related to the redirection and conclusion of projects. 
When advice is not taken, ITL usually provides a good rationale for why a given action has not occurred. Examples of positive responses to suggestions made in last year’s report include the improved strategic planning observed in the Mathematical and Computational Sciences Division, the redirection of the work on distributed detection in sensor networks in the Advanced Networking Technologies Division, the transfer of the latent fingerprint workstation to a Federal Bureau of Investigation (FBI) contractor in the Information Access Division, and the work on connecting NIST to Internet 2 in the Information Services and Computing Division. More discussion of responsiveness and of areas needing continued attention is presented in the divisional reviews below.

In some areas, the issues raised by the panel are long-term questions or areas in which changes are not entirely within ITL’s power. In these cases, the panel looks to see if serious effort has been made. Usually the panel observes some progress and plans to follow up on the issues in future assessments. The location of the MCSD and SED Divisions in NIST North is one such issue, and while the panel is glad to learn that their relocation in conjunction with the occupation of the AML is being considered, the panel will be watching to see whether this occurs and how ITL handles the time prior to AML’s completion. Another such issue is the growing use by industry of consortia and other private groups to set industry standards. The panel recognizes that this trend cannot be controlled by ITL, but it would like to see further consideration of internal policies on use of closed consortia and of ways to encourage open standards development.

MAJOR OBSERVATIONS

The panel presents the following major observations:

• The panel is impressed with the progress that has occurred in strategic planning in the Information Technology Laboratory (ITL), particularly in the emergence and acceptance of a framework under which laboratory activities operate. The framework includes an ITL Research Blueprint and ITL Program/Project Selection Process and Criteria.

• ITL has done a remarkable job of becoming more customer-oriented over the past several years. The panel applauds the laboratory’s efforts in outreach and notes that the progress reflects improvement in a whole range of areas, from gathering wider and more useful input to help with project selection to increased dissemination and planning for how customers will utilize NIST results and products. The strong customer relationships now need to be balanced by robust visibility and recognition in ITL’s external peer communities. Publications in top-tier journals, presentations at high-profile conferences, and awards from ITL’s peers will help confirm the technical merit of the work done at NIST and will add to the laboratory’s credibility with its customers.

• Conveying awareness of the social issues related to ITL’s technical work in areas such as biometrics is an important element of the credible presentation of ITL results to diverse audiences. In certain areas, considering the technical and social context of how the work will be used may help focus the research on the most appropriate questions.

• The shift of the information technology (IT) support functions to a new unit reporting directly to the NIST director is an opportunity and a challenge for NIST leadership. If this new unit can convince the NIST laboratories to embrace consistent, institutionwide standards for IT systems, it will be an important step and a major cultural shift at NIST. Appropriate emphasis is being placed on demonstrating how IT services can facilitate research and how standardizing basic applications can save time and money.

• The retirement of the current director of ITL is clearly a source of concern within the laboratory. The panel recommends that NIST leadership focus on communicating clearly with staff about the selection criteria for the director’s replacement and that it supply staff with frequent updates on the progress of the search and hiring process. Sharing of relevant information will certainly help the transition proceed more smoothly.

An Assessment of the National Institute of Standards and Technology Measurement and Standards Laboratories: Fiscal Year 2002 money and efficiency, of inconsistent or duplicative efforts—for example, the distributed systems administration, which is weakening computer security at NIST. Another example is contracts with external providers of services or software, where each individual unit negotiates its own rates and terms. The panel is pleased to see that the division is working very hard to fix both of these situations. The positive aspect of IT services being provided in the units is the presence of support staff who work closely with a specific set of researchers and understand deeply and well their needs and constraints. To take full advantage of the positive aspects of dispersed IT support personnel, the panel suggests that the division consider creating numerous separate, and perhaps unique, service-level agreements with the laboratories and research groups rather than a uniform approach to diverse NIST-wide needs. A range of agreements might reassure the laboratory staff that the division recognizes that each group has its own special needs, and then it might make the researchers more understanding of the instances in which some degree of uniformity is necessary (e.g., in security). Another path to encouraging conformance to standards and use of centralized services is that of making the centralized services more attractive than the alternatives. When the central mail services were improved recently, more people began using the main mail servers, and many of the small independent, unsupervised mail servers in the individual laboratories were able to be shut down. Security improved, and the amount of time spent NIST-wide on supporting e-mail services decreased. The recent launch of a centralized PC-buying service is another example of providing incentives for researchers to embrace a more efficient and uniform system. In the future, the division might consider consolidating server and storage functions so as to both improve responsiveness to the laboratories’ research agendas and reduce NIST’s overall costs and risks of lost data. The panel is pleased to learn that the new IT services unit will include a group focusing particularly on providing solutions to assist researchers in tackling unique scientific problems. The goal would be to help NIST staff in the laboratories utilize commercial off-the-shelf products relevant to their experiments and perhaps to create an “explorers group” that would investigate new software and hardware that might be applicable to NIST research. The panel supports this approach and hopes that it will help the IT support group engage the NIST research community in creative thinking about IT solutions and allow the scientists to realize the value that IT can add to their experimental work. It was not clear to the panel if the current structure through which the division receives advisory input from the laboratories is actually providing (or being perceived as providing) optimal opportunities for two-way communication. It is also possible that communications and relations between the IT services unit and its customers might improve with the increased stature of IT services that may result from its organizational shift from laboratory division to individual unit whose head reports directly to the NIST director. 
The panel notes that one consequence of the organizational change will be that the IT services are no longer reviewed by the National Research Council assessment panels, which review the programs under way in the NIST Measurement and Standards Laboratories, such as ITL. External assessments of programs drive self-evaluation as well as providing unbiased advice from different perspectives, and the panel recommends that NIST management explore ways for the new unit to receive this sort of input from outside the institution.

Funding sources for the Information Services and Computing Division are shown in Table 8.7. As of January 2002, staffing for the division included 136 full-time permanent positions, of which 109 were for technical professionals. There were also 11 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

In last year’s report, the panel encouraged division and laboratory management to look for ways to increase diversity at the management level. This year, as part of a reorganization of the division’s groups, competitions for several top-level positions were reopened in search of candidates, to bring new people into the management ranks. Although few women and minority management candidates were ultimately identified, the initiative set an important precedent, and the panel supports further efforts in this area.

TABLE 8.7 Sources of Funding for the Information Services and Computing Division (in millions of dollars), FY 1999 to FY 2002

Source of Funding                        FY 1999 (actual)   FY 2000 (actual)   FY 2001 (actual)a   FY 2002 (estimated)
NIST-STRS, excluding Competence                 0.6                0.9                0.3                 0.2
STRS—supercomputing                             0.9                0.9                7.4                 9.1
ATP                                             0.0                0.0                0.1                 0.2
OA/NFG/CRADA                                    0.0                0.0                0.6                 0.7
Other Reimbursable                              0.4                0.6                1.0                 0.3
Agency Overhead                                 7.1                8.2               18.2                25.8
Total                                           9.0               10.6               27.6                36.3
Full-time permanent staff (total)b               72                 77               131b                 136

NOTE: Sources of funding are as described in the note accompanying Table 8.1.
a The significant difference between the FY 2000 and FY 2001 funding and staff levels reflects the reorganization of ITL, in which information technology service groups were moved out of the Convergent Information Systems Division and into this division.
b The number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March.

Software Diagnostics and Conformance Testing Division

Technical Merit

The mission of the Software Diagnostics and Conformance Testing Division is to develop software testing tools and methods that improve quality, conformance to standards, and correctness; to participate with industry in the development of forward-looking standards; and to lead efforts for conformance testing, even at the early development stage of standards. The division’s work designing conformance and diagnostic tests and developing reference implementations for standards bodies clearly fulfills its mission and is consistent with the goals expressed in both the division and NIST missions. The division is organized in three groups: Software Quality, Interoperability, and Standards and Conformance Testing. The technical merit of the work of all three groups is quite high.

The Software Quality Group develops methods to automate software testing, develops software diagnostic tools, and performs research in formal methods. Projects under way include work on automatic test generation, enterprise single sign-on, quantum information, interactive television, and health care information systems. In the automated test generation project, staff have drawn upon prior work in test harnesses, mutation testing, and specification-based testing. This project is relatively mature, and the panel expects that the current toolset will be transitioned to industry quite soon. The
An Assessment of the National Institute of Standards and Technology Measurement and Standards Laboratories: Fiscal Year 2002 staff have maintained an excellent relationship with Ford Motor Company over the course of this project, and the division should consider Ford and other companies as possible targets for this technology transfer. If the automated test generation project does continue at NIST, the division might consider adding formal test oracles to the toolset; these could be used for checking test results against specifications of expected and/or desired functionality or other quality properties. In the current toolset, the test data are generated on the basis of specification mutations that are used to provide an adequate range of tests for the software. According to the mutation testing paradigm, a mutation “test” is killed if it provides different results from the original test, but the original test’s “correctness” is determined by some other means. This approach is extremely costly and, if not automated, potentially error-prone. Using the formal specifications to develop automated test oracles would be an effective alternative. Since formal specifications are available in the domains of exploration (consortia and standards bodies), the use of formal specification-based test oracles would improve the quality of the toolset at a limited additional cost. The Interoperability Group works with other federal government agencies, with the voluntary standards community, and with industry to increase the use of publicly available standards in order to achieve and enhance interoperability. A primary role of this group is that of working with government groups, including the Federal CIO Council, in the application of standards and the development of interoperability tests for IT systems and products that cross several agencies. Current projects include the National Software Reference Library, computer forensics tool testing, and work on metadata standards. Group staff also serve as ITL representatives on a variety of standards committees. The Standards and Conformance Testing Group develops conformance tests and reference implementations, performs research into better ways to do conformance testing, and, working with industry, develops standards for emerging technologies. Currently, the primary focus areas of this group are XML and pervasive computing. In the XML area, the panel continues to be impressed with how well the division works with industry groups to establish means by which software and systems can interoperate over the Internet. In pervasive computing, a key component of the work is related to architectural description languages (ADLs), which can improve technical specifications of system architectures, especially for those systems in which dynamic adaptation and dependability are required. The panel commends the division’s decision to focus pervasive computing efforts on ADLs and simulation; elements of this project will be relevant to applications well beyond the context of pervasive computing. The panel has two suggestions about the ADL efforts. One is to consider whether xADL might be relevant to the project. Unlike traditional ADLs, xADL has an emphasis on dynamically reconfigurable architectures and is defined as a set of XML schemas. This approach gives xADL extensibility and flexibility, as well as allowing basic support by many commercial XML tools. 
The panel’s second suggestion is to consider expanding the intent of the ADL effort beyond improving and extending specifications and to include work on specification-based testing activities. A substantial amount of current work exists in the area of architecture-based testing, where the ideas behind specification-based testing are applied on the basis of formal architecture descriptions. Architecture-based testing is particularly useful in integration, conformance, and interoperability testing, because it is tied to the architectural design level. It is applicable in analysis, test planning, and test generation at the stage of specifying the architectural configuration and then equally applicable in actually testing the software during integration. This approach would complement and support several other division projects and hence seems worth exploring.
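The discussion of specification mutation and formal test oracles above may be easier to follow with a small, self-contained sketch. The Python below is purely illustrative: the clamping function, its specification, and the single hand-written mutant are invented for this example and are not drawn from NIST’s toolset; in the division’s setting the mutants and test data would be derived from the formal specification rather than written by hand.

# Illustrative sketch only. All names are invented for this example; this is
# not the NIST automated test generation toolset.

def implementation(x):
    """System under test: intended to clamp x into the range [0, 100]."""
    return max(0, min(100, x))

def spec(x, y):
    """Formal specification used as an automated test oracle:
    the result must lie in [0, 100] and must equal x when x is already in range."""
    in_range = 0 <= y <= 100
    preserved = (y == x) if 0 <= x <= 100 else True
    return in_range and preserved

def mutant_spec(x, y):
    """A specification mutant: the lower bound is (wrongly) made exclusive."""
    in_range = 0 < y <= 100          # mutation: '<=' changed to '<'
    preserved = (y == x) if 0 <= x <= 100 else True
    return in_range and preserved

# Test inputs chosen to exercise boundary behavior (hand-picked here; a real
# toolset would generate them from the specification).
tests = [-5, 0, 1, 50, 100, 101]

for x in tests:
    y = implementation(x)
    original_verdict = spec(x, y)
    mutant_verdict = mutant_spec(x, y)
    killed = original_verdict != mutant_verdict
    print(f"x={x:4d}  y={y:3d}  spec={original_verdict}  "
          f"mutant={mutant_verdict}  mutant killed: {killed}")

A test input “kills” the mutant specification when the original oracle and the mutant oracle return different verdicts on the same observed behavior; inputs that kill no mutants add little to the adequacy of the test set, which is the sense in which mutation drives test data generation.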

An Assessment of the National Institute of Standards and Technology Measurement and Standards Laboratories: Fiscal Year 2002 Program Relevance and Effectiveness The Software Diagnostics and Conformance Testing Division delivers value to users and providers of software through its facilitation of improvements in software quality and interoperability. The division develops products such as reference implementations and conformance test suites, provides technical leadership by chairing standards committees and participating in consortia, and lays the groundwork for overall advancements in this field by researching improved methods of conformance testing. NIST’s role as an active but neutral third party in standards processes, coupled with the outstanding quality of the conformance tests developed by this division, provides government and industry with a service that is both necessary and unique. The panel continues to be impressed by the division’s focus on emerging technologies and the effectiveness with which it partners closely with industry. Staff work well with a wide variety of organizations (e.g., the Organization for the Advancement of Structural Information Systems [OASIS], the Worldwide Web Consortium [W3C], the Air Transport Association [ATA]), and the division also works directly with individual companies, such as Ford Motor Company, Sun Microsystems, IBM, and Microsoft, on products and applications to improve the interoperability available to users. Overall, the division’s relationships with industry and industry groups are outstanding. The panel does note that the division’s focus on these important activities has limited the time and effort available for publications and presentations in what academics would consider the top-tier journals (e.g., IEEE or ACM transactions) and conferences (e.g., the International Conference on Software Engineering, the International Symposium on the Foundations of Software Engineering, and the International Symposium on Software Testing and Analysis). However, through its more general projects, the division does support various research communities. For example, the ADL work provides a common set of measurements to enable comparison and analysis across systems, and this clearly fulfills an important need of the ADL community. The federal government clearly benefits, as do all users, from the division’s work to improve the interoperability and performance of commercial software systems. However, the division also has a range of activities targeted directly at assisting a variety of agencies. The highest-profile projects are the National Software Reference Library and the computer forensics tool testing, which serve the law enforcement community at many levels. The work with the Federal CIO Council also continues to be important across government, and various projects are supporting individual agencies. For example, the work on health care information systems is being done in conjunction with the Department of Veterans Affairs. The division’s effectiveness is exemplified by the XML conformance project. In this effort, the division’s significant contributions to the standards process were critical to the success of XML as a truly “open” standard. While industry itself recognized the value of conformance tests, it was unwilling or unable to commit the resources needed to organize the development of a substantial set of tests for XML. 
A fledgling effort was established by an industry consortium to undertake this effort, but it failed to generate sufficient support. The division stepped into the partial vacuum created, led a revitalized effort, organized support by industry, and collected tests from a variety of sources. These actions facilitated the open discussion of conformance to the standard by major (and minor) suppliers of XML technology, and the division is primarily responsible for the overall success of the effort and the existence of the standard and the conformance tests that are necessary to allow the use of XML to flourish. W3C has now initiated a quality-assurance activity in this area, and the panel hopes that the division’s experience and expertise will be effectively utilized as industry moves forward on defining XML standards, testing, and usage. As can be seen in the example above, a key element of NIST’s effectiveness is the division’s good relationships with industry and its ability to work with industry groups such as consortia. Timing is a

critical factor, too, as the division’s impact on standards and software is dependent on getting involved early in the standards development process. However, to participate effectively in consortia often requires legal paperwork and agreements, and the division continues to be hindered by the poor responsiveness of the Department of Commerce legal department.

The panel is particularly impressed with the division’s record both on beginning and concluding projects. Division staff have shown good judgment in selecting new areas in which to work, which is particularly impressive given the wide range of standards activities on which this division could potentially have an impact. The division’s philosophy of getting involved early in the standards process (focusing on emerging technologies) and partnering with industry maximizes the value of NIST’s work. In addition, the division has shown a willingness to discontinue work in an area if NIST’s contributions do not appear to be needed or if the technology is not being embraced as anticipated. Finally, the division is good at setting metrics and goals at the beginning of each project so that it will be clear when the objectives have been accomplished and it is time to conclude the project. This year, the Role-Based Access Control project and the Computer Graphics Metafile (CGM) project have been concluded, and the resources largely allocated to other projects. However, NIST staff will continue to support work in these areas when there is a specific industry request for their assistance. This willingness to provide ongoing support when necessary requires very little actual time from NIST staff but is important for ensuring successful technology transfer.

Division Resources

Funding sources for the Software Diagnostics and Conformance Testing Division are shown in Table 8.8. As of January 2002, staffing for the division included 37 full-time permanent positions, of which 33 were for technical professionals. There were also 15 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

TABLE 8.8 Sources of Funding for the Software Diagnostics and Conformance Testing Division (in millions of dollars), FY 1999 to FY 2002

Source of Funding                        FY 1999 (actual)   FY 2000 (actual)   FY 2001 (actual)   FY 2002 (estimated)
NIST-STRS, excluding Competence                 4.8                4.8                4.9                5.3
Competence                                      0.6                0.5                0.5                0.4
ATP                                             0.4                0.6                0.2                0.0
OA/NFG/CRADA                                    0.6                1.0                1.9                2.8
Total                                           6.4                6.9                7.5                8.5
Full-time permanent staff (total)a               39                 37                35a                 37

NOTE: Sources of funding are as described in the note accompanying Table 8.1.
a The number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March.

The panel continues to be impressed by the high degree of camaraderie exhibited between and among managers and staff in this division. Various factors appear to contribute to the good morale observed. One is the strong sense of teamwork that exists; communication within the division is very good, and since there are no one-person projects, staff do not feel isolated. In addition, a very high percentage of the division’s activities involve collaborations with other ITL divisions, and these relationships expand division staff’s knowledge of various technical fields and are good for morale. The division has successfully recruited several new staff members in the past year, and the panel considers that the division is quite healthy.

Only two concerns were expressed by staff in informal interactions with the panel. One is the difficulties experienced in dealing with the NIST procurement system; the delays in purchasing equipment are particularly frustrating in the IT arena, as the technology available changes rapidly and the division requires access to relevant hardware and software to have an impact on industry. The second concern was the division’s budget deficit as of February 2002; it was not clear where the funds necessary to make up this gap would come from.

Statistical Engineering Division

Technical Merit

The mission of the Statistical Engineering Division is to advance measurement science and technology by collaborating on NIST multidisciplinary research, by formulating and developing statistical methodology for metrology, and by applying statistical principles and methodology to the collection and analysis of data critical to NIST scientists and engineers. The division is involved in a broad range of activities, including the provision of support to NIST scientific research, collaborative multidisciplinary research with NIST scientists, development of new statistical methodology with a special focus on metrology, and the transfer of statistical methodology to NIST scientists and the broader scientific community. The demand for collaborative interactions with division staff continues to be very high.

Less than 2 years ago, a new division chief was hired; her primary task has been to rebuild the Statistical Engineering Division into a premier national resource for statistical sciences. Great progress has been made on this task, and continuing efforts are essential to allow the division to keep pace with the statistical demands arising from new technologies that are being applied to measurement systems and metrology. The health and activities of this division are crucial elements of the success of future NIST research.

The Statistical Engineering Division is located primarily at NIST Gaithersburg, where the staff are split into two groups: the Measurement Process Evaluation Group and the Statistical Modeling and Analysis Group. In addition, a group of staff from this division work at NIST Boulder where they are close to collaborators in the EEEL, CSTL, PL, and MSEL divisions located on that campus.

Projects in a wide variety of fields are currently under way in the Statistical Engineering Division. Below, the panel describes several ongoing activities, but these efforts are just a few examples of the division’s many successful projects.
The project highlights discussed in this section fall into three categories: Bayesian methodology, uncertainty analysis for key comparisons, and uncertainty analysis for process measurements. In the first area, the division has made major contributions through its work on Bayesian metrology. A fundamental problem in metrology is the assessment and assignment of realistic uncertainty to measurement results. In many complex problems, such as the analysis of high-throughput measurements, high-dimensional data, and complex dynamical systems, it is important to combine expert knowledge and prior information with physical measurements. The researchers in the Statistical Engineering Division have adopted the Bayesian framework to solve these sorts of problems. This framework provides a scientific basis and a formal approach for using scientific knowledge and prior information to yield better designs of experiments and testing strategies. The application of this approach has produced a number of key achievements in the past year in the areas of interlaboratory intercomparisons, international key comparisons among national measurement institutes (NMIs), elicitation of prior information to calculate uncertainties, and development of nonparametric Bayesian models using empirical distributions.

One example of an important application of Bayesian methodology is the international study of the sublethal effects of fire smoke on survivability and health, a joint project between the division and the NIST Building and Fire Research Laboratory. The goal was to obtain consensus values and uncertainty measures for lethal and incapacitating toxic potency values for large numbers of building materials, based on data from the many different studies that have been published. The challenge was that the quality of the data varied greatly from study to study. Statistical Engineering Division researchers developed a Bayesian hierarchical model to combine the data from studies with and without uncertainty measures by constructing vague priors at the lowest level of the hierarchy. The results of this project have had a large impact on the building industry at both the national and international levels, and the success of the method attracted a great deal of interest from other researchers at NIST.

In the area of key comparisons and uncertainty analysis, the division has taken a lead in international efforts to establish equivalence among the many national standards organizations throughout the world. The mechanism for these efforts is the Mutual Recognition Agreement (MRA) among the NMIs and regional metrology organizations that belong to the International Committee for Weights and Measures (CIPM). In this work, the greatest challenge for the division has been to develop a set of sound statistical design and analysis procedures to be used in interlaboratory studies for establishing the equivalence of national standards. Key comparisons have five critical phases: (1) agreement among NMI scientists on the specific transfer standard (and/or measurement process), (2) design of the multinational experiment, (3) data collection at each NMI, (4) determination of the reference value and assessment of standard uncertainty at each NMI, and (5) determination and reporting of the level of equivalence among the participating NMIs and the related uncertainties. In the past year, division statisticians have developed a unified approach to experimental design and analysis to be applied in the work on key comparisons. Facilitation of key comparisons is an important element of NIST's support of the United States in the recent trend toward open markets and globalization.

The value of the division's expertise in data comparison can be seen even in comparisons that predate the MRA. Data for comparison of laboratories' realizations of the International Temperature Scale were collected over several years by 15 laboratories around the world prior to the signing of the MRA.
As a result, no information was available as to whether the submitted components of uncertainty were for an individual measurement or for the mean of replicated measurements. Other problems included determining how each uncertainty component contributed to the measurement error of the process, what uncertainties were associated with the standard platinum resistance thermometers used as transfer instruments, how to appropriately compute coverage factors to obtain expanded uncertainties with correct confidence levels, and, finally, how to explain the effects arising from the diverse paths for computing temperature differences across subsets of laboratories. Working jointly with the NIST Chemical Science and Technology Laboratory (CSTL) and two laboratories in Germany and Australia, the Statistical Engineering Division was able to overcome all of these challenges and to produce useful results from the comparison data. These results are having a significant impact on international temperature standards and on the sale of temperature-related equipment and services between countries. The success of this key comparison makes it a role model for future comparisons.
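The central quantities in such an analysis can be illustrated with a minimal sketch. The code below is not the division's unified procedure; it is a simplified, hypothetical example of how a weighted-mean reference value, each laboratory's deviation from that value, and coverage-factor-expanded uncertainties (k = 2) might be computed from a set of laboratory results, assuming uncorrelated, approximately normal errors. The data values and variable names are invented for illustration only.

```python
import numpy as np

# Hypothetical laboratory results and standard uncertainties (not data from any
# actual comparison): each lab reports one value x[i] with standard uncertainty u[i].
x = np.array([100.02, 99.97, 100.05, 99.99])
u = np.array([0.03, 0.02, 0.04, 0.02])

# Weighted-mean reference value, with weights inversely proportional to variance.
w = 1.0 / u**2
ref = np.sum(w * x) / np.sum(w)
u_ref = np.sqrt(1.0 / np.sum(w))

# Each lab's deviation from the reference value and its approximate standard
# uncertainty (the subtraction accounts for the lab's contribution to the mean).
d = x - ref
u_d = np.sqrt(u**2 - u_ref**2)

# Expanded uncertainties with coverage factor k = 2 (about 95 % for normal errors).
k = 2.0
U_d = k * u_d

for i, (di, Ui) in enumerate(zip(d, U_d), start=1):
    print(f"Lab {i}: deviation = {di:+.3f}, expanded uncertainty = {Ui:.3f}")
print(f"Reference value = {ref:.3f} +/- {k * u_ref:.3f} (k = 2)")
```

A real key comparison must also handle correlated uncertainty components, drift of the transfer standard, and laboratories excluded from the reference value; those complications are precisely what the division's unified design and analysis methodology is meant to address.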

Another project that is a benchmark example of what is possible in a key comparison is the division's work on statistical uncertainty analysis for the comparison of resistance standards. The NIST Electronics and Electrical Engineering Laboratory (EEEL) and 14 other NMIs participated in the CIPM Consultative Committee for Electricity and Magnetism's Comparisons of Resistance Standards. Two types of dc resistors were used as traveling standards and were measured by different laboratories at different times. Systematic drifts of the traveling standards and laboratory measurement uncertainties were the main causes of discrepancies among the measurements. Nevertheless, Statistical Engineering Division staff developed an accurate statistical model based on linear regression to combine the measurements (a simplified sketch of such a drift regression is given below), and this model can now provide a basis for calibrating the high-resistance standards of the laboratories' customers.

Another important contribution of the Statistical Engineering Division is the study of uncertainties associated with process measurements. Data of this kind, such as fluid-flow measurements, high-speed optoelectronic signals, and measurements of spray characteristics, occur frequently in the NIST laboratories. In the past year, division staff have worked collaboratively with other NIST scientists on solving difficult problems, drawing on expertise in statistical signal processing, time series, and statistical smoothing techniques. One such joint project is with the CSTL on flow measurements for multimeter transfer standards. The result of this work is an in-house prototype system for evaluating in detail the behavior of dual-meter systems. With the new understanding gained from this prototype, an efficient experimental design has been tested and then modified specifically for the international key comparison setting. In conjunction with the experimental design development, a methodology for data analysis has been put into place for the initial key comparison, for which NIST is the pilot laboratory. The time spent on this project was highly leveraged, as the new protocol is serving as the prototype for the other five areas under study in the CIPM's Working Group for Fluid Flow and as the basis for all future international flow comparisons.

The importance of the division's expertise and experience in statistical issues related to signal processing can be seen in the work with EEEL on high-speed optoelectronic measurements. Division staff have developed state-of-the-art statistical signal processing techniques to reduce the random component of the timing error and the systematic component of time-base distortion in these measurements. Using a regression spline model, the average of the aligned signals is interpolated onto an equally spaced time grid based on the estimated time-base distortion, and the resulting power spectrum is then corrected for jitter effects by an estimated multiplier. The laboratory's new measurement capability will be used to support industrial applications in the areas of Gigabit Ethernet networks, Fibre Channel, optical telecommunications, and wireless communications. The results have been published in IEEE Transactions on Instrumentation and Measurement.
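As a concrete illustration of the linear-regression drift modeling mentioned above for the resistance comparison, the following is a minimal sketch with made-up numbers. The data, variable names, and the simple straight-line drift are illustrative assumptions; the division's actual model combines measurements from many laboratories and two types of traveling standards and is necessarily more elaborate.

```python
import numpy as np

# Hypothetical measurements of one traveling resistance standard during a
# comparison: time in days, measured deviation from nominal (parts in 10^6),
# and the reporting laboratory's standard uncertainty. Values are invented.
t = np.array([0.0, 30.0, 60.0, 95.0, 130.0, 170.0])
y = np.array([0.12, 0.18, 0.22, 0.31, 0.35, 0.44])
u = np.array([0.03, 0.05, 0.04, 0.05, 0.03, 0.06])

# Weighted least-squares fit of a linear drift model y(t) = a + b*t, with
# weights 1/u^2 so that more precise laboratories count for more.
W = np.diag(1.0 / u**2)
X = np.column_stack([np.ones_like(t), t])
A = X.T @ W @ X
beta = np.linalg.solve(A, X.T @ W @ y)
cov = np.linalg.inv(A)            # covariance of the fitted coefficients
a, b = beta
u_b = np.sqrt(cov[1, 1])          # standard uncertainty of the drift rate

# Residuals from the fitted drift line show how far each laboratory's result
# departs from the consensus behavior of the traveling standard.
residuals = y - X @ beta
for ti, ri in zip(t, residuals):
    print(f"t = {ti:5.1f} d   residual = {ri:+.3f}")
print(f"drift rate b = {b:.5f} +/- {u_b:.5f} per day")
```

The sketch conveys only the basic idea of separating the standard's drift from laboratory-to-laboratory differences; handling two traveling standards and each laboratory's systematic effects requires the fuller model described in the text.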
Program Relevance and Effectiveness

As demonstrated by the preceding examples, the efforts of the Statistical Engineering Division have a broad impact on the work of the NIST scientists and engineers with whom the division collaborates. Since NIST's mission and activities focus on measurement science, the division staff have developed some capabilities that are unique within the scientific community. These capabilities include expertise and experience in techniques for statistical uncertainty analysis for measurement science (i.e., measurement processes) and statistical methods for metrology. These are important capabilities, and the panel firmly believes that the division can and should play a pivotal role in NIST's support of U.S. industry by promoting industrial statistics and by helping to link key statistical groups in academia, in industry, and at U.S. national laboratories. For the past few years, the division has appropriately focused on rebuilding its reputation and maintaining a strong portfolio of work within NIST (and ITL), but now that the division is growing stronger, an increased focus on external relationships and responsibilities should be the natural next step.

Technology transfer is one area in which external organizations might benefit from learning more about the Statistical Engineering Division's projects and approaches. The division staff have an excellent reputation for their ability to turn research projects into standard methods, tools, and software that can be easily used by NIST scientists. This is a form of technology transfer, and these products are particularly valuable because they continue to raise the statistical competency of NIST scientists while avoiding a pitfall commonly experienced in other organizations: scientists becoming dependent on statisticians for experimental analysis, thereby limiting the time that the statisticians have to push the leading edge of statistical research and development. Not only are NIST statisticians adept at technology transfer within NIST, but division staff are also actively involved in general education about statistical uncertainty and promote statistical methodologies by offering short courses and workshops and by producing the Web-based NIST/SEMATECH Engineering Statistics Internet Handbook.13 Other statistics organizations in industry and at the national laboratories struggle with technology transfer and could learn a great deal from the division's successes in this area.

The Statistical Engineering Division plays an important role in the national and international metrology communities through contributions to documents and handbooks of standard methodology, promotion of statistical approaches to metrology, collaboration on international experiments, participation in international metrology organizations, publications in leading metrology journals, and assistance to other NMIs through training and collaborations. The value of the division's work is known and appreciated throughout the relevant physical sciences communities as a result of the dissemination efforts mentioned above and the large number of publications that result from the division's collaborative efforts and appear in subject-matter scientific journals. However, the statistical sciences community should also be benefiting from the division's unique expertise and experience. Division staff do publish in statistics journals and present at statistics conferences, and the panel was pleased to see a surge in dissemination of the division's research in peer-reviewed journals, as suggested in last year's assessment report. Nonetheless, the burden is still on the division to facilitate stronger interactions with the statistical sciences community as a whole. One element of achieving higher visibility in the statistics discipline and maximizing the division's impact on the statistics community should be generalizing the methodologies developed for specific problems, showing their relevance to a wider range of similar problems, and making these results broadly available through publications in leading statistical journals.

Division Resources

Funding sources for the Statistical Engineering Division are shown in Table 8.9.
As of January 2002, staffing for the division included 19 full-time permanent positions, of which 17 were for technical professionals. There were also 12 nonpermanent or supplemental personnel, such as postdoctoral research associates and temporary or part-time workers.

13   The NIST/SEMATECH Engineering Statistics Handbook is available online at <http://www.itl.nist.gov/div898/handbook/index.html>.

TABLE 8.9 Sources of Funding for the Statistical Engineering Division (in millions of dollars), FY 1999 to FY 2002

Source of Funding                        FY 1999     FY 2000     FY 2001     FY 2002
                                         (actual)    (actual)    (actual)    (estimated)
NIST-STRS, excluding Competence            2.9         3.0         3.4         3.7
Competence                                 0.5         0.6         0.3         0.5
STRS—supercomputing                        0.1         0.0         0.0         0.0
ATP                                        0.0         0.0         0.2         0.3
Measurement Services (SRM production)      0.0         0.0         0.1         0.5
OA/NFG/CRADA                               0.1         0.1         0.1         0.4
Total                                      3.6         3.7         4.1         5.4
Full-time permanent staff (total)a         23          19          17a         19

NOTE: Sources of funding are as described in the note accompanying Table 8.1.
aThe number of full-time permanent staff is as of January of that fiscal year, except in FY 2001, when it is as of March.

As noted above, the progress evident since the hiring of the new division chief in the fall of 2000 is extraordinary. Morale is now high, top-rate visiting researchers have been engaged to fill technical gaps, and new junior personnel have been hired (in spite of difficulties due to the governmentwide hiring freeze imposed by the change in administration). It is clear that the division is moving quickly in exactly the right direction. However, the division's recovery is still at a somewhat fragile stage, and continued support from ITL and NIST management will be needed to fulfill the division's long-term goals. A 5-year plan is in place to increase the division's full-time permanent technical staff from 16 to 24, and the division is on track in year 2 of the plan. The growth should be a combination of junior and senior hires, and, to continue to move forward, the division will require a commitment from management to steadily increase the division's core support.

The Statistical Engineering Division has done well in competitions for internal research and development funds, and this year has seen a significant increase in the division's activities related to the production of standard reference materials (SRMs). This growth is due primarily to the staff's outstanding development of educational courses for outreach to NIST researchers, who have in turn recognized the opportunities presented by working with the division on SRM-related projects. The panel believes that the staff should aggressively continue these outreach activities, which produced the surge in national and international SRM work.

The panel continues to be concerned about the relative isolation of the Statistical Engineering Division in its current location at NIST North. The issues related to this recurring concern have been discussed at length in many past assessment reports. If NIST is to obtain the maximum value possible from the division, the panel strongly urges NIST management to consider relocating this division to the main campus. If relocation is not an option in the near term, NIST management should actively work with ITL and division management on other creative approaches to solving this problem.
