An Assessment of the National Institute of Standards and Technology Information Technology Laboratory: Fiscal Year 2009 (2009)

Chapter: 2 General Assessment of the Information Technology Laboratory

2 General Assessment of the Information Technology Laboratory

The total FY 2009 projected available funds for the Information Technology Laboratory are $104.6 million, of which $40.0 million (38 percent) is other-agency funding, $63.9 million (61 percent) is scientific and technical research services (STRS) funding, and $0.7 million (1 percent) is other NIST funding. The ITL has a total of 332 staff members, of whom 37 percent are computer scientists, 21 percent are mathematicians/statisticians, and 14 percent each are information technology (IT) specialists, engineers/physicists, and administration and support personnel. Additional information on the funding for the 11 crosscutting key ITL programs is provided below.

SOME IMPORTANT ACTIVITIES AT THE LABORATORY

Many things are working very well at the Information Technology Laboratory. The ITL has established 11 key programs that cut across the traditional ITL divisions, and it uses a matrix approach to manage them. The panel observed that the program structure, discussed below, has been adopted well and rapidly. This section discusses some important positive aspects of the overall program.

Role as the "Honest Broker"

Many ITL activities can be characterized as a national or international resource. In many cases, ITL staff are the only such resource available. In others, they have established a role as the neutral party that is appropriately charged with even-handed measurement or evaluation of the quality of products. The challenge-problem activities mentioned below are important examples. For another example, ITL staff performed the empirical studies that allowed proposed standards for fingerprint and iris recognition to be confirmed as realistic.

Challenge Problems

There is a methodology for advancing technology that is especially appropriate in many of the activities for which the ITL is responsible, and it is both effective and pervasive in the ITL culture. A number of teams are responsible for issuing challenges to the research community in the form of task definitions, test data sets, and associated ground truth on which software is to be evaluated. The prototype program was the Text Retrieval Conference (TREC), at which the information-retrieval community has been given yearly challenges such as finding relevant documents or answering questions. More recent programs of this type include the Text Retrieval Conference Video Retrieval Evaluation (TRECvid), the Multiple Biometric Grand Challenge (MBGC; multimodal biometrics), and machine translation (MT). A similar program is Secure Hash Algorithm 3 (SHA-3), a design challenge in secure hashing, in which ITL staff and community volunteers analyze the work of the participants.
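
To make the challenge-problem methodology concrete, the following minimal sketch shows how a TREC-style harness might score one system's retrieved documents for a topic against the ground-truth relevance judgments distributed with the challenge, using precision and recall. It is an illustration only, not ITL code; the document identifiers and relevance judgments are hypothetical.

    # Hypothetical sketch of a TREC-style evaluation step: compare one
    # system's retrieved documents for a topic against the ground-truth
    # relevance judgments supplied by the challenge organizers.

    def precision_recall(retrieved, relevant):
        """Return (precision, recall) for a single topic."""
        retrieved, relevant = set(retrieved), set(relevant)
        hits = len(retrieved & relevant)
        precision = hits / len(retrieved) if retrieved else 0.0
        recall = hits / len(relevant) if relevant else 0.0
        return precision, recall

    # Illustrative data only; real challenges use far larger collections.
    ground_truth = {"topic-1": ["doc3", "doc7", "doc9"]}
    system_run = {"topic-1": ["doc3", "doc5", "doc9", "doc12"]}

    for topic, relevant_docs in ground_truth.items():
        p, r = precision_recall(system_run.get(topic, []), relevant_docs)
        print(f"{topic}: precision={p:.2f}, recall={r:.2f}")

Publishing the task definition and ground truth while keeping the scoring procedure fixed and neutral is what allows results from competing teams to be compared directly.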

The international respect given to the ITL "challenge problems" allows the U.S. government to leverage a comparatively small investment into major technical progress on problems of immediate interest to many governmental agencies. Most of these challenge problems appear to be financed by agencies outside the Department of Commerce, with the ITL performing the required research and providing test development and execution. With the ITL setting up carefully defined tests and distributing data sets, government agencies can entice competing academic and industrial organizations to focus research and development (R&D) monies on narrowly defined problems of immediate practical interest. In addition to providing a selection of benchmarks for quantifying progress in the field as teams from around the world compete to be the best at the yearly tasks, the management of challenge problems presents serious research problems for the ITL staff itself as they address the metrology involved. For example, there have been attempts to automate, or partially automate, the process of evaluating the quality of machine translation from language to language.

PROGRAMS AND PROJECTS

There has been very rapid progress in the Information Technology Laboratory with respect to putting programs in place. Following a matrix management approach, the divisions retain responsibility for administration and remain the focal point for a particular discipline, for example, statistics or security. Projects are intended to address particular problems or classes of problems that draw on several disciplines, both across the ITL and, in some cases, across NIST. The programs and their budgets (as of March 25, 2009, broken out by total and by distribution across ITL headquarters [HQ] and the ITL divisions) are as follows. The division abbreviations are: MCSD, Mathematical and Computational Sciences Division; ANTD, Advanced Network Technologies Division; IAD, Information Access Division; SSD, Software and Systems Division; SED, Statistical Engineering Division; CSD, Computer Security Division.

1. Complex Systems (Total = $2,421,000: HQ = $329,000, MCSD = $1,019,000, ANTD = $498,000, IAD = $177,000, SSD = $289,000, SED = $109,000)
2. Cyber and Network Security (Total = $14,804,000: HQ = $330,000, MCSD = $26,000, ANTD = $2,679,000, CSD = $11,769,000)
3. Enabling Scientific Discovery (Total = $4,737,000: HQ = $327,000, MCSD = $2,890,000, CSD = $33,000, SED = $1,487,000)
4. Identity Management Systems (Total = $9,142,000: HQ = $318,000, ANTD = $34,000, CSD = $4,365,000, IAD = $4,424,000)
5. Information Discovery, Use, and Sharing (Total = $7,788,000: HQ = $461,000, MCSD = $939,000, ANTD = $125,000, IAD = $4,136,000, SSD = $1,534,000, SED = $593,000)
6. Pervasive Information Technology (Total = $1,833,000: HQ = $287,000, MCSD = $148,000, ANTD = $500,000, CSD = $90,000, IAD = $184,000, SSD = $624,000)
7. Trustworthy Information Systems (Total = $3,267,000: HQ = $329,000, MCSD = $176,000, CSD = $16,000, IAD = $212,000, SSD = $2,534,000)
8. Virtual Measurement Systems (Total = $1,030,000: HQ = $244,000, MCSD = $400,000, SED = $386,000)
9. Quantum Information (Total = $2,621,000: MCSD = $796,000, ANTD = $1,696,000, CSD = $129,000)
10. Voting (Total = $3,702,000: CSD = $1,103,000, IAD = $979,000, SSD = $1,620,000)
11. Healthcare Information Technology, or Health IT (Total = $1,246,000: HQ = $602,000, CSD = $49,000, SSD = $595,000)

(A brief arithmetic spot check of these program totals is sketched at the end of this overview.)

The ITL presented high-level introductions describing the scope of each of the 11 programs in existence at the time of the panel's review. However, the panel's ability to drill down into the technical activities of each program was limited to those projects that were covered in the individual division-based reviews. At future reviews it would be useful to hear more detailed descriptions of the technical activities of each program, especially those aspects that by their nature cut across several divisions. In addition, the ITL should consider having external, independent reviews of programs periodically.

One deficiency in some of the program presentations to the panel was a lack of clear identification of the focus or theme of the program. For some programs the name implies the focus (this is particularly true of some of the congressionally mandated programs), but other programs did not properly identify a national need being addressed and, as a result, appeared to be more of a collection of individual projects without any overarching national goal as a driver. It should be a specific responsibility of the program manager to ensure that this issue is addressed in all presentations and documentation provided to the panel in the future. Other programs appeared to be an uncomfortable collection of group activities with no explicable goals or metrics. The Pervasive Information Technology Program lives up to its name in the sense of being a set of technologies, and each group is working on standards, but there is no discernible effort to have these technologies form something larger than the sum of their parts. Also, overall there did not seem to be a clear and cohesive roadmap for the activities subsumed under the Health Information Technology effort. By contrast, the Quantum Information Program seems well suited to matrix organization, and the NIST research expertise has salutary effects on a wide swathe of other activities, particularly in helping practitioners understand the possible impacts of quantum computing on cryptographic security.
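
The brief spot check mentioned after the program list is sketched below: it sums the division-level contributions for a few of the programs and compares them with the stated totals. The figures (in thousands of dollars) are transcribed from the list above; this is an illustrative consistency check, not an official reconciliation, and the published totals are rounded to the nearest thousand dollars.

    # Spot check: do the division-level contributions sum to the stated
    # program totals? Figures are in thousands of dollars, transcribed
    # from the program list above; illustrative only.

    programs = {
        "Complex Systems": (2421, [329, 1019, 498, 177, 289, 109]),
        "Cyber and Network Security": (14804, [330, 26, 2679, 11769]),
        "Voting": (3702, [1103, 979, 1620]),
    }

    for name, (stated_total, parts) in programs.items():
        computed = sum(parts)
        status = "OK" if computed == stated_total else "differs (rounding?)"
        print(f"{name}: stated {stated_total}, computed {computed} -> {status}")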

Role of Program Managers

Program managers appear free to be advocates for their programs and do not seem burdened with routine administrative tasks. Their role in practice appears to be evolving and differs among programs. Program managers should be, at least to some extent, advocates and marketers for the programs that they lead. In many cases it makes sense for them to look for external funding or for customers in the other laboratories of NIST for the projects associated with their programs. It is important for program managers to be recognized technical leaders in their fields; the more important the program, the more vital it is to have a strong and widely known advocate in a leadership role. For example—repeating the panel's suggestion made 2 years ago—it would be helpful to bring in a strong scientist (expert in health informatics, maybe an M.D.) to lead the medical program efforts, especially given the recent thrust to develop electronic medical records.

There are concerns among some of the staff that the creation of programs and program managers has increased the amount of "overhead" and reduced the number of people actually doing the work. If the program managers are capable of providing technical leadership and also devote effort to promoting the interests of program participants, then they should be regarded as positive contributors, even if they are no longer writing code or doing other technical tasks associated with individual projects. The group of program managers, being mostly younger or midcareer people, will eventually provide a valuable cadre of experienced managers within the ITL.

Sunrise and Sunset of Projects

Although the programs are too new to have provided much opportunity to deal with project evolution, it would be wise to think about the process and philosophy for both terminating projects and generating new ones. Projects that support a particular standard can risk continuing and consuming ITL resources forever if no external body is willing to take ownership of the standard. In general, there should be a process in place for doing a critical review of projects periodically to make sure that they are fulfilling a need and functioning well. Likewise, there should be a process in place for creating new projects. Many of these will be mandated by Congress or will arise from a need within another laboratory of NIST itself. However, it would be stimulating to encourage bottom-up proposals. This approach may interact in important ways with the matter of external (outside agency [OA]) funding, discussed below. The philosophy in both the process for terminating projects and the process for creating new projects should be to base evaluations and decisions on what efforts are most appropriate for the ITL within NIST's mission.

BUDGET ISSUES

There are large, temporary changes in budget levels, overlaying the normal up-and-down progression of funding cycles. At the ITL, budgeting is having more influence than it should on the progress of the technical work. One issue is the role of "soft money"—the resources that come from outside NIST. A second issue is that in some places within the ITL it has not been possible to hire critical staff. The third major issue is how the arrival of the recently announced temporary funding can be used to benefit the technical goals of the laboratory rather than to result in a burden on already-overloaded staff. Each of these issues is discussed below.

Soft Money

The ITL divisions vary greatly in how much OA money they rely on. The Information Access Division (IAD) is the only division that receives the majority of its budget from outside NIST. In addition, the Statistical Engineering Division (SED) and the Mathematical and Computational Sciences Division (MCSD) receive significant amounts of money from other laboratories at NIST.

Taking money to perform tasks for others could be harmful. It could lead to the ITL's personnel devoting effort to problems that are not worthwhile science just because there is money available to support staff. In extreme cases, the role of NIST as an impartial broker could be jeopardized. The panel has not, however, seen these problems in practice. Some of the externally funded work is among the most interesting and impactful in the ITL.

The existence of staff supported by soft money can lead to a certain paralysis and stagnation. For example, some research staff in the IAD expressed a reluctance to invest their time until the outside money was in hand (something that does not always happen on the most desirable schedule). There may also be an inhibition against seeking outside sources of support for good projects. The ITL leadership has made it clear that well-conceived projects with outside funding do not present a risk to the research staff; the laboratory will backstop research personnel against unforeseen loss of OA money. However, staff are fearful and risk-averse, especially in the current economy.

The use of soft money has benefits and potential risks. As long as the potential risks are monitored and avoided, a policy of encouraging a search for solid external support for sound, internally vetted projects is worthwhile and likely to lead to the direct funding of new, important, and relevant projects.

Hiring

There is a similar problem concerning the filling of staff positions: researchers are unwilling to expend the effort to search for a new person without a clear mandate to do so. With the new crosscutting structure, some staff are unclear about who is responsible for conducting a search. And, of course, busy people are not eager to devote a lot of time to an effort that can come to naught for a variety of reasons.

The situation in the Statistical Engineering Division is of significant concern. The dynamic division head who was hired about 2.5 years ago has developed good support and rapport with his staff and ITL management. However, the problem noted by the panel 2 years ago remains: there have been no other new hires at SED for at least 7 years, and a number of new and important branches of the field and emerging scientific needs are not covered by permanent staff. This issue should be addressed creatively and energetically now.

On a positive note, the ITL was able to deal with one of the problems of 2 years ago: the inability to hire scientists in nontraditional fields at appropriate salaries. A linguist was hired for work on the machine-translation challenge problems. The process of such hires should be further regularized, because there is likely to be a continued need for additional personnel in the soft sciences for such projects as the accessibility of voting machines.

Short-Term Money

The ITL noted its responsibility for spending a large part of the money that has been allocated for economic recovery (through the American Recovery and Reinvestment Act of 2009 [Public Law 111-5]), especially in the area of electronic medical records, but in many other subdisciplines as well. It would be ideal if the fellowships and short-term arrangements could be used to develop longer-term collaborations—ones that could last beyond the money and perhaps lead to joint ventures in better times. Yet there is a concern, among both the staff and the panel, that the net result will be that the scientists do a lot of work functioning as program managers and get nothing out of it to support their own research activities. The staff, perhaps led by the program managers, should start looking for external groups, such as those at research universities, where there is expertise that could fit well with the programs. They should be proactive in making known both the opportunity and the needs of their teams.

RAISING THE PROFILE OF THE LABORATORY

Although the various ITL units differ in their approach to external visibility, performance evaluation should stress the points made below, which include a few principles that could be applied to benefit ITL scientists and improve their ability to fulfill their mission.

Outreach

The ITL should try to build connections with major research universities and research laboratories. The existence of short-term funding support should help initiate these connections. Soft money should be used to search for and invite more faculty, postdoctoral researchers, and other short-term visitors. The ITL should encourage staff to participate in professional service, such as service on conference committees (tasks other than program committee work often lead to recognition and membership on later program committees) or participation in speakers' bureaus.

Publication

Many ITL scientists have extensive publication lists; for those who publish regularly, it is important to be selective in their choice of venue. Publication is far from the only way to have an impact on science. In many areas of computer science (although not in mathematics or statistics), conferences are more respected than journals, because the top conferences are harder to get into than even the best journals. The top few conferences in a field often have acceptance rates of around 10 percent. The ITL should give more attention to the quality of publication venues and should reward staff who can publish their work in the places with the greatest visibility and prestige.

Journal and conference papers serve not only to disseminate project results, but also to advertise the capabilities and successes of a program and to provide a permanent, searchable record for future users.
