Assessments of Laboratory Divisions

MATHEMATICAL AND COMPUTATIONAL SCIENCES DIVISION

National and Agency Priorities

The Mathematical and Computational Sciences Division (MCSD) has a well-formulated view of the way it contributes to national priorities by advancing science and industrial innovation, and individual contributors understand how their work fits with the goals of NIST. In particular, the teams understand the importance of simulation-based engineering and how to deliver their technology to scientists and practitioners, inside and outside NIST, who are the ultimate users. Two projects clearly address national priorities:

- Numerical Optimization of Complex Instrumentation uses mathematical methods to determine appropriate settings for spectroscopy instruments, enabling less subjective forensic analysis. The work here has already improved standardization and repeatability of analyses, resulting in a more robust process.

- The OOF2 (Object-Oriented Finite Element Analysis) project on image-based finite-element analysis of material microstructures has evidently met its goal of system portability, based on what enthusiastic users say. It is contributing to advances in materials science, particularly the development of novel alloys.

MCSD staff have the scientific expertise required to make significant contributions to this technical thrust. Their research publications, numerous collaborations, and education level (72 percent hold Ph.D.’s) attest to their readiness to perform their mission.

Technical Merit

Researchers have plenty of collaborators, are well published, and participate in scientific conferences and workshops. Overall, the technical merit of the projects measures up to the current state of the art. The projects reviewed have well-defined end users and a clear idea of how they will succeed. Typical projects include these:

- Modeling of Rheological Properties of Suspensions uses up-to-date scientific computing methodology to help understand the properties of cement-based materials. The project is well connected to worldwide industry.

- The Digital Library of Mathematical Functions project cleverly chose contour-fitted grids to produce proper visualizations of special functions. Such attention to detail will contribute to a worthy modern successor to Abramowitz and Stegun’s widely used Handbook of Mathematical Functions.

Recent publications indicate that MCSD’s work in quantum computing is of the highest caliber.

The technical merit of MCSD work is also recognized by scientific colleagues. A member of the staff won the prestigious Arthur S. Flemming Award given to young federal agency employees. Two members of MCSD are now fellows of the American Physical Society, and one was recently honored as a distinguished scientist by the Association for Computing Machinery (ACM).

Facilities and Staff

Facilities appear sufficient for the research activities. Most projects are doing well enough with existing computers; one project with greater needs was allocated a large amount of supercomputer time. MCSD’s move from the adjunct campus to the main NIST campus will help collaboration. Individual researchers have adequate two-person offices. The work of MCSD requires tremendous individual concentration not usually attainable in multiperson offices, so if space constraints dictate a more compressed working environment, ITL management should support alternatives such as part-time telecommuting in order to preserve or even enhance the work environment.

Morale is reasonably high, and the staff are clearly confident of their research directions and management. They are not risk-averse, in the sense that members are quite willing to forge partnerships outside ITL and outside NIST. ITL staff expressed some concern over hiring and budget shifts. Staff and management also expressed a desire for more postdoctoral positions and are taking steps to obtain the necessary resources; more postdoctoral hires would be good for MCSD. Lastly, ITL should plan carefully to develop program managers and successors for the division management.

INFORMATION ACCESS DIVISION

The Information Access Division (IAD) within ITL has many long-standing, externally funded challenge programs in speech, text, and image processing, aimed at improving national technical capabilities by enticing industry and academia to address problems of interest to the government. Three examples of such programs are the Text Retrieval Conference (TREC), the Language Recognition Evaluation (LRE), and the Face Recognition Vendor Test (FRVT). There are no other external programs of such depth and maturity, and these challenge programs attract a wide variety of international participants. The other IAD challenge problems—Speech Recognition Evaluation, Spoken Term Detection, TREC Video, Document Understanding Conference, Fingerprint Minutiae Interoperability Exchange Test, Machine Translation Evaluation, and ACQUAINT—also have no peers. As a whole, they represent an important national asset.

These programs deserve to be fostered and protected. While the stability of government programs should not depend on the individuals assigned to them, a stable group of mature professionals must be maintained for the efficient execution of these complicated technical challenges. When converting to a matrix management structure, ITL should avoid disrupting the established IAD challenge programs and should seek to keep the expertise in place.

IAD has been asked to establish standards and evaluation programs for a number of human-centric technologies, such as voting machines, machine text translation, and content-based access methods. “Human-centric” means that humans are either part of the process or direct consumers of the machine output. The performance of such technologies must be assessed in the context of human cognition and physical capabilities. Standards development and evaluations, particularly of technical systems to be used by the public, such as voting systems, must be done with a focus on universal accessibility. (“Universal accessibility” is to be understood here in the broadest possible sense, including young, middle-aged, and old; tall and short; left- and right-handed; and near- and far-sighted.) Such evaluations require professionals with expertise outside the traditional ITL disciplines of mathematics, statistics, engineering, physics, and computer science. Recognizing this, IAD has been hiring and developing expertise in the social sciences, but it should prepare for even more staff growth in these areas. ITL should strive to remove barriers to hiring professionals outside the traditional mathematical and engineering sciences.

STATISTICAL ENGINEERING DIVISION

Addressing National Priorities

The Statistical Engineering Division (SED) research program plays a significant role in ITL’s work in support of national priorities. The division’s statistical metrology effort is a unique national resource. Its expertise on issues related to measurement, including novel work on Bayesian methods to combine information about both statistical and nonstatistical sources of error (illustrated schematically below), is an important asset. In addition, SED personnel play a vital role in collaborative efforts with other NIST programs, helping those programs to address relevant national priorities—for example, a collaborative project with the NIST Center for Neutron Research supports development of hydrogen fuel cells. SED continues its long-standing, vital contribution to NIST’s manufacture of standard reference materials (SRMs), with every SRM requiring SED validation. Finally, SED now plays a significant role in support of national needs expressed by other U.S. government departments and agencies. Noteworthy in this regard is an ongoing project for the Department of Homeland Security (jointly with personnel from the NIST Physics Laboratory) on the evaluation of radiation detectors as part of the effort to protect the United States against nuclear terrorism.
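The Bayesian combination of statistical (Type A) and nonstatistical (Type B) uncertainty information mentioned above can be sketched with a textbook Gaussian conjugate model; this is a generic illustration of the idea, not SED’s specific formulation. Suppose $n$ repeated observations of a measurand $\theta$ yield a mean $\bar{x}$ with Type A standard deviation $s$ (treated as known), while a Type B assessment—manufacturer data, physical limits, or expert judgment—is encoded as a prior $\theta \sim N(\mu_0, u_0^2)$. The posterior is then

\[
\theta \mid \bar{x} \sim N(\mu_1, u_1^2), \qquad
u_1^2 = \left(\frac{1}{u_0^2} + \frac{n}{s^2}\right)^{-1}, \qquad
\mu_1 = u_1^2\left(\frac{\mu_0}{u_0^2} + \frac{n\bar{x}}{s^2}\right),
\]

so that both sources of information enter the reported value and its standard uncertainty $u_1$ within a single coherent framework.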

Impact of Programs

The SED research program has a clearly identified mission that emphasizes (1) support of and collaboration in NIST research efforts; (2) participation in international metrology efforts; (3) support for the NIST SRM program; (4) participation in projects for outside agencies; and (5) education and outreach to teach about uncertainty and to describe measurement work within NIST, including interdisciplinary collaborations with the physical sciences and a leadership role in the international metrology community. Success in these efforts has a clear link to the national priorities being addressed by NIST to enhance industrial competitiveness.

Technical Merit

The statistical metrology research program is clearly state of the art. SED researchers publish regularly in international metrology journals and are among the leaders in international metrology efforts. One member of the staff was recently appointed as a permanent member of the primary international measurement group’s committee that promotes statistical tools in measurement. SED is recognized as the largest statistical team focusing on metrology. The collaborative research with other NIST laboratories also makes use of state-of-the-art statistical methodology and stands as a strong example of the potential for science-motivated collaborations to advance the development of statistical methodology. For example, SED’s innovative work in experimental design ensures that NIST investments in experimentation and evaluation are as cost-effective as possible and will achieve the experimental goals. There is a need to expand SED’s ability to bring state-of-the-art methodology to the many NIST projects that can use its support. This can be done by increasing the size of the division, as discussed below.

Facilities, Equipment, and Human Resources

The SED scientists appreciate the move that has colocated SED staff on the main NIST campus and note that it has important benefits for their collaborative work. The computational infrastructure is appropriate to the tasks the group performs.

SED needs additional human resources. The work being done is, as described above, significant and of high quality. SED staff reported that because of resource limitations they are being forced to choose among important projects. The SED scientists remain committed to their long-term mission of providing collaborative support for projects across NIST. This has always been a difficult challenge (there are currently approximately 20 scientists in SED and more than 2,000 in NIST). New opportunities—for example, to develop standards for microarray studies in biology and to participate in crosscutting intra-ITL programs—are important to the national metrology effort and of interest to SED staff. However, the scientific staff at SED noted that participation in such efforts requires cutting back on their collaborative efforts with other NIST laboratories and with groups external to NIST. For example, the Metrology Group’s focus on participating in international metrology efforts has meant that it has not been able to participate in some standards projects for the American Society for Testing and Materials. Additional staff are needed in SED and would represent an appropriate ITL investment.

ADVANCED NETWORK TECHNOLOGIES DIVISION

The Advanced Network Technologies Division (ANTD) contributes in a number of networking areas, all of which are of growing importance to the nation’s competitive position in the world, as well as to the safety of its citizens. As part of securing the cyberspace infrastructure, ANTD has stepped up to author the protocol standards that would make the naming and routing services of the Internet more secure and is pushing for their deployment. It has responded to the needs of the various agencies of the U.S. government by producing a guide for deploying IPv6, as it has been mandated to do by the Office of Management and Budget.

In the ever-important area of communications between different public safety radio systems, ANTD has assisted the Department of Homeland Security with P25 intersystem specifications and hands-on interoperability testing. (P25 is a suite of standards that specify the interfaces between the various components of a land mobile radio system.) In its mission of developing measures for new technologies, the division’s renowned work on quantum key distribution has produced a device for measuring the efficiency of any photon detector, and work with entangled photons is on the horizon. The broad area of complex information systems, a new program involving scientists from multiple divisions within ITL, has uncovered surprising causal links between simple actions of network components and chaotic network events. ANTD’s various wireless projects are timely for industry and the nation, addressing a mix of public safety and private-sector concerns. Both the clever use of radio technologies for tracking and identifying people inside a building and the development of methods for rapid deployment of ad hoc wireless networks based on vector quantization of received signal strength build on the strong wireless expertise in this division.

The division could do even better work if it had access to certain external research networks connected independently of the NIST campus network for security reasons. The IPv6 work lacks credibility without a network attachment to the worldwide interconnected networks running IPv6, and when the National Science Foundation’s (NSF’s) Global Environment for Network Innovations (GENI) project goes live, NIST would be conspicuously absent without a link to it. (The goal of the GENI project is to increase the quality and quantity of experimental research outcomes and transitions in networking and distributed systems.) This is an ongoing problem that needs to be solved. There are also several opportunities for applying massive computing to some very important networking problems. For example, doing sufficiently accurate simulations of network behaviors, with enough breadth of experiment for statistical credibility, could benefit from orders of magnitude more computing than is currently available at NIST. Institutional commitments to internal investment, as well as participation in the use of national resources, could be most useful.

COMPUTER SECURITY DIVISION

NIST’s mandated role in support of the Federal Information Security Management Act (FISMA) requires significant effort and the development of many standards documents. FISMA was created as a way to protect federal information systems, but it also involves the creation by federal agencies of a great deal of documentation. The effectiveness of certification and accreditation processes such as those mandated by FISMA in bringing about effective security and protection is far from universally accepted by computer-security professionals. Although certification and accreditation processes play a significant role in computer security today, standards and guidelines should have their success firmly demonstrated. ITL has not demonstrated the success of its guidelines, and its customers may have reservations.

Cryptography

ITL is a respected leader in cryptography competitions, and its work in this area is first-rate. The NIST-led competition that resulted in the creation of the Advanced Encryption Standard (AES) is well respected. The Computer Security Division’s (CSD’s) Encryption Group merits a high level of support to ensure the continued success of ITL’s cryptographic standards work. One criticism is that NIST competitions have had shifting criteria for success. Competitors have sometimes lobbied for adding new criteria or for valuing one criterion over another, and this lobbying introduces an additional dimension to the competition, one that is subjective and not as transparent as might be desirable. ITL is planning to conduct a competition for a new generation of cryptographic hash functions, and there should be clear criteria for the evaluation of candidate hash functions.

Research on quantum computing and its impact on cryptography is speculative. It is appropriate to conduct such research at NIST, which has qualified researchers investigating the area. Though it is not certain that quantum computing is possible, its success would require the development of new cryptographic standards.

Voting

Voting is a difficult area within which to develop standards, and NIST’s impact in this area may be minimal. NIST is probably in a good position, however, to gather information about the multitude of ballot types and to develop a generic ballot description language.

Policy Machine

NIST has initiated a project in pursuit of a standardized access control mechanism, referred to as the Policy Machine (PM). The project takes a very generic approach to policy management, but the architecture seems to include a centralized decision-making component that is unnecessarily constraining. Project members insist that it could be decentralized with commonly used techniques, but this seems to ignore the fact that some decisions must be made locally (within a small administrative domain). It is not clear that this is an area amenable to standardization. Nonetheless, it is forward-looking research in an important area that might bear fruit. The Internet Engineering Task Force has some policy management efforts—for example, IPsec (Internet Protocol security) policy based on an extensive and detailed mapping of IPsec standards to management information bases—some companies build security policy management methods on Microsoft’s access control mechanisms, and almost every security research conference includes a few papers on policy management. ITL’s work should be tied more closely to this outside work. Also, the personal identity verification (PIV) work would seem a natural fit for the application programming interfaces being developed for the PM. Tying the authentication together with the authorization would be a strong argument for the utility of the PM.

Personal Identity Verification

NIST is making important contributions to interoperability by standardizing the critical components of personal identity verification (PIV). There are a surprising number of critical components surrounding PIV, and standardization of information transfer and compliance testing are good ideas. This is solid work at the core of ITL’s competencies. The intent to investigate ontologies is a good idea that might bring some order to the impending chaos of identity methods. The efforts to support timely issuance are laudable. ITL should be looking at privacy, though, including consideration of the consequences of requiring a large amount of personal information to be carried on physical tokens, stored on computers at many government installations, and/or handled by contractors. The radio frequency identification (RFID) work considers a plethora of government policies and recommendations for privacy.

Radio Frequency Identification

The testing for eavesdropping and jamming is important work. It includes cryptographic methods to protect RFID tags and the systems that read them, and this is a difficult problem. There are some research papers published in the proceedings of conferences on topics like cryptographic hardware and embedded systems; ITL should make its presence in the area more noticeable through outside associations.

Security Testing and Metrics

The requirements of NIST’s Federal Information Processing Standard 140 (FIPS 140, Security Requirements for Cryptographic Modules) are well respected by vendors of cryptographic systems. These standards are important for assurance in cryptography for U.S. and foreign systems. ITL’s role in validating laboratories that test products for compliance is appropriate and well conducted. The standards do have a reputation for being somewhat daunting to newcomers, and vendors may delay the effort to achieve compliance until they can afford to hire a knowledgeable consultant. Over 90 percent of first-time applicants fail because their documentation is inadequate. One might conclude that the standards do not adequately convey what is needed, and ITL should consider improving the explanations. A smaller percentage (approximately 30 percent) fail the algorithm testing. ITL provides a very good tool to the testing laboratories for checking functionality. If that tool were also available to applicants, they could easily do their own functionality testing and spend less time interacting with the testing laboratory. ITL should publish the testing tool as an open source project.
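As a rough illustration of what vendor-side functionality testing involves, the hypothetical sketch below checks a local hash implementation against a published known-answer vector (the public FIPS 180-2 SHA-256 test value for the message "abc"). It is not ITL’s validation tool; it simply shows how little machinery an applicant would need in order to catch basic functional errors before engaging a testing laboratory.

```python
# Hypothetical sketch of a vendor-side known-answer test; not ITL's tool.
import hashlib

KNOWN_ANSWERS = {
    # message -> expected SHA-256 digest (public FIPS 180-2 example vector)
    b"abc": "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad",
}

def run_known_answer_tests(digest_fn):
    """Return (message, got, expected) for every vector the implementation fails."""
    failures = []
    for message, expected in KNOWN_ANSWERS.items():
        got = digest_fn(message)
        if got != expected:
            failures.append((message, got, expected))
    return failures

if __name__ == "__main__":
    # Here the "implementation under test" is just the standard library's SHA-256.
    failures = run_known_answer_tests(lambda m: hashlib.sha256(m).hexdigest())
    print("all known-answer tests passed" if not failures else failures)
```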

Automated Combinatorial Testing

This work is largely repetitive of efforts conducted during the 1980s, and it is unlikely to lead to effective processes for assessing the security of real software systems. ITL should expand the work and try to apply it to mature NIST testing programs. In particular, the crypto-validation program is a potential customer. As an example, the digital signature validation implementation error that was discovered in 2006 may have been the sort of error that the combinatorial testing could have discovered. The encoding of a digital signature involves a small grammar with a length field as an element. Combinatorial testing applied to the grammar might have developed tests for proper encoding of the length, and this might have revealed the implementation error long before visual inspection finally did.
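A minimal sketch of that idea is given below, using an invented toy length-field grammar rather than the actual signature encoding or any NIST tool. Enumerating every combination of a few factors (length form, declared length, trailing bytes) and comparing a lax parser’s verdict against a strict rule immediately flags the one combination the parser wrongly accepts.

```python
# Hypothetical sketch of combinatorial testing over a toy length-field grammar.
# The grammar, parser, and factor names are invented for illustration only.
from itertools import product

factors = {
    "length_form":  ["short", "long"],                      # one-byte vs. 0x81-prefixed length
    "declared_len": ["exact", "too_small", "too_large"],
    "trailing":     ["none", "extra_bytes"],
}

def encode_case(length_form, declared_len, trailing, payload=b"\x01" * 5):
    """Build a toy tag-length-value encoding for one combination of factors."""
    actual = len(payload)
    declared = {"exact": actual, "too_small": actual - 1, "too_large": actual + 1}[declared_len]
    length_bytes = bytes([declared]) if length_form == "short" else bytes([0x81, declared])
    suffix = b"\xff\xff" if trailing == "extra_bytes" else b""
    return b"\x30" + length_bytes + payload + suffix

def lax_parse_ok(buf):
    """Parser under test: checks the declared length but not minimal encoding."""
    if len(buf) < 2 or buf[0] != 0x30:
        return False
    if buf[1] == 0x81:
        declared, body = buf[2], buf[3:]
    elif buf[1] < 0x80:
        declared, body = buf[1], buf[2:]
    else:
        return False
    return len(body) == declared

# Enumerate all factor combinations; a strict (DER-style) rule accepts only an
# exact, short-form length with no trailing bytes.  Any disagreement is a finding.
for combo in product(*factors.values()):
    case = dict(zip(factors, combo))
    should_accept = (case["length_form"] == "short"
                     and case["declared_len"] == "exact"
                     and case["trailing"] == "none")
    if lax_parse_ok(encode_case(**case)) != should_accept:
        print("possible implementation error exposed by", case)
```

Run as written, the check flags the case in which the toy parser accepts a non-minimal (long-form) length encoding, which is the kind of laxity that exhaustive or covering-array test generation is well suited to exposing.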

SOFTWARE DIAGNOSTICS AND CONFORMANCE TESTING DIVISION

Addressing National Priorities

Several of the Software Diagnostics and Conformance Testing (SDCT) Division’s programs address important national priorities. The Help America Vote Act of 2002 (voting support), XML standards, computer forensics, and health information technology projects are among the most notable. ITL’s role in electronic voting is paramount for ensuring that new voting systems perform as they should. The SDCT division is providing support to the Election Assistance Commission’s Technical Guidelines Development Committee and works with voting officials, voting system vendors, and academic researchers to better understand the critical issues and possible approaches. The excellent SDCT division work that furthered the success of XML-related standards is clearly beneficial to information technology, and the particular standards are likely to improve national economic efficiency by permitting the better integration of diverse processes. The computer forensics project supports law enforcement agencies, particularly the National Institute of Justice, in their investigations of computer-related crimes. The health information technology project gives ITL the opportunity to contribute to the national agenda by addressing one of the most pressing problems in health care technology today, namely, methods to ensure measurable, confidential, and secure exchange of pertinent health care information.

The Degree to Which Projects Are Well Motivated

SDCT projects are well defined. Each clearly states the problem and the approach to it. Once a project has achieved its goals, it is subject to sunsetting and transfer to an appropriate industry or agency partner. The XML technologies conformance testing project is one such example. The SDCT team developed a comprehensive set of test suites for XML, which has been widely recognized for its high quality. The work has had a broad national and international impact, and it serves as a model for other testing and conformance efforts, which are well placed within this division’s portfolio. While the XML project has been a success and is properly being sunsetted, it is not obvious how such decisions are made for other projects and whether a formal technology transfer plan is in place in all cases. SDCT project work is well received by colleagues in the federal agencies with whom division staff interact. Staff regularly prepare technical reports, but the impact and visibility of their work would be enhanced by more extensive publication in the peer-reviewed literature.

Technical Merit of the Programs

The technical merit of SDCT division programs is high, and the projects are conducted by technically competent individuals. Several of the projects that could have a very high impact are briefly discussed here.

The voting project has a good specification and testing plan, and SDCT division staff are working with the relevant stakeholders to assess and define the needs of a trustworthy electronic voting infrastructure. It was not clear whether the security needs have been fully defined, and some collaboration with the security group within ITL could be desirable.

The computer forensics project is the gold standard for enabling the exclusion of known packaged software from forensic analysis; that is, the signatures of many common pieces of software have been identified, so files matching those signatures can be set aside during an investigation (a schematic sketch of such known-file filtering follows at the end of this subsection). However, in order to be fully successful, the project would need to scale to a much larger number of software packages and libraries, and it would need to include downloaded software, which accounts for most of the software purchased or upgraded today. Either significant growth or a plan for technology transfer might be considered for this project.

The computational grid project is addressing an important problem, but the experiments to date consider only networks of 1,000 or fewer machines. This project should consider the trend toward larger grids.

The Software Assurance Metrics and Tool Evaluation (SAMATE) project is also addressing an important problem and has catalyzed efforts elsewhere—for example, the National Vulnerability Database, Cigital, and Symantec—but to stay relevant, ITL must scale up the size of the software examples. Even if some tools are unable to handle the larger code snippets, ITL should lead the definition of benchmarks that would guide industry toward the problems of direct interest to increase overall assurance of future systems.

The health information technology project has made good progress in outreach and visibility in important national health care standards organizations and multidisciplinary medical associations such as the American Telemedicine Association. SDCT division staff have taken on a leadership role in Integrating the Healthcare Enterprise (IHE), an external group that promotes the coordinated use of existing standards in health care. The potential impact of ITL’s work in formulating health care information technology standards is great. ITL is well positioned to play an important role in these efforts, but it would be desirable to have a mid- to long-term program roadmap and strong health informatics leadership commensurate with this broad and complex opportunity.
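The known-file filtering mentioned above can be pictured with a short, entirely hypothetical sketch: hash every file in an evidence directory and set aside those whose hashes appear in a reference set of known-software signatures. The hash set, paths, and function names here are invented; this is not the NIST reference data or its tooling.

```python
# Hypothetical sketch of known-file filtering for forensic triage; illustrative only.
import hashlib
from pathlib import Path

# Invented reference set of hashes for known, uninteresting files.
KNOWN_SOFTWARE_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",  # empty file
}

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large evidence files do not exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

def files_needing_review(evidence_dir: str):
    """Yield files whose hashes are NOT in the known-software reference set."""
    for path in Path(evidence_dir).rglob("*"):
        if path.is_file() and sha256_of(path) not in KNOWN_SOFTWARE_SHA256:
            yield path
```

A production reference set would of course be far larger and maintained centrally; the point of the sketch is only the filtering step itself.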

Adequacy of SDCT Division Resources

The facilities, equipment, and human resources are adequate for the work performed by the SDCT division. Many of the projects in the division are externally mandated and funded with targeted dollars. One way to foster innovative, investigator-initiated research might be to grant the division some percentage of unconstrained funding each year. These funds would be disbursed at the discretion of the senior management and could be used to jump-start a few fledgling projects.

One issue of concern is that the research equipment is subject to sometimes onerous information technology regulations that interfere with carrying out the research mission. Also, overall administrative overhead seems to be increasing, with multiple new forms and regulations burdening the relatively small administrative staff.
