Suggested Citation:"1 Introduction." National Research Council. 2005. 2003-2004 Assessment of the Army Research Laboratory. Washington, DC: The National Academies Press. doi: 10.17226/18595.
1
Introduction

THE BIENNIAL ASSESSMENT PROCESS

The charge of the Army Research Laboratory Technical Assessment Board (ARLTAB) is to provide biennial assessments of the scientific and technical quality of the Army Research Laboratory (ARL). These assessments include the development of findings and recommendations related to the quality of ARL's research, development, and analysis programs. While the primary role of the Board is to provide peer assessment, it may also offer advice on related matters when requested to do so by the ARL Director; the advice provided focuses on technical rather than programmatic considerations. The Board is assisted by standing National Research Council (NRC) panels that focus on particular portions of the ARL program. The Board's assessments are commissioned by ARL rather than by one of ARL's parent organizations.

The Army Research Laboratory Technical Assessment Board currently consists of 8 leading scientists and engineers whose experience collectively spans the major topics within the scope of ARL. Six panels, one for each of ARL's in-house directorates,1 report to the Board. Each Board member sits on a panel, 6 of them as panel chairs. The panels range in size from 9 to 20 members, whose expertise is tailored to the technical fields covered by the directorate(s) that they review. In total, 82 experts participated, without compensation, in the process that led to this report.

The Board and panels are appointed by the National Research Council with an eye to assembling balanced slates of experts without conflicts of interest and with balanced perspectives. The 82 experts include current and former executives and research staff from industrial research and development (R&D) laboratories, leading academic researchers, and staff from Department of Energy (DOE) national laboratories and federally funded R&D centers. Fifteen of them are members of the National Academy of Engineering (NAE), a number have been leaders in relevant professional societies, and several are current or past members of organizations such as the Army Science Board, the Air Force Scientific Advisory Board, the Air Force Weapons Laboratory, and the Defense Advanced Research Projects Agency (DARPA).

The Board and its panels are supported by National Research Council staff, who interact with ARL on a continuing basis to ensure that the Board and panels receive the information they need to carry out their assessments. Board and panel members serve for finite terms, generally of 4 years, staggered so that there is regular turnover and a refreshing of viewpoints. Biographical information on the Board and panel members appears in Appendix B, along with a chart listing the Board membership and the name of each panel, its membership, and the name of the ARL directorate that it reviews.

1The six ARL directorates are the Computational and Information Sciences Directorate (CISD), Human Research and Engineering Directorate (HRED), Sensors and Electron Devices Directorate (SEDD), Survivability and Lethality Analysis Directorate (SLAD), Vehicle Technology Directorate (VTD), and Weapons and Materials Research Directorate (WMRD) (see Appendix A, which contains an ARL organizational chart as well as a tabulation of ARL funding by technical unit). The Board does not have a panel specifically devoted to the Army Research Office (ARO), which is another unit of ARL, but all Board panels examine how well ARO and ARL's in-house research and development are coordinated.

Preparation and Organization of This Report

The current report is the third biennial report of the Board. The Board's first biennial report appeared in 2000, and annual reviews by the Board appeared in 1996, 1997, and 1998. As with the earlier reviews, this report contains the Board's judgments about the quality of ARL's work (Chapters 2 through 7 focus on the individual directorates). The rest of this chapter explains the rich set of interactions that support those judgments.
The amount of information funneled to the Board, including the consensus evaluations of the recognized experts who make up the Board's panels, provides a solid foundation for a thorough peer review. This review is based on a large amount of information received from ARL and on panel interactions with ARL staff. Most of the information exchange occurs during the annual meetings convened by each panel at the appropriate ARL sites. In both scheduled meetings and less formal interactions, ARL evinces a very healthy level of exchange and acceptance of external comments.

The assessment panels engaged in many constructive interactions during their annual site visits in 2003 and 2004. In addition, useful collegial exchanges have taken place between panel members and individual ARL investigators outside of meetings, as ARL staff members seek additional clarification about panel comments or questions and take advantage of panel members' contacts and sources of information. Agendas for the 2003 and 2004 meetings of the panels are presented in Appendix C.

Panel meetings last for 2 or 2½ days, during which time the panel members receive a combination of overview briefings by ARL management and technical briefings by ARL staff. Prior to the meetings, some panels receive extensive materials for review, including staff publications. The overview briefings bring the panels up to date on the Army's long-range planning. This context-building step is needed because the panels are purposely composed mostly of people who, while experts in the technical fields covered by the directorate(s) they review, are not engaged in work focused on Army matters. Technical briefings for the panels focus on the research and development goals, strategies, methodologies, and results of selected projects at the laboratory. Briefings are targeted toward coverage of a representative sample of each directorate's work over the 2-year assessment cycle.
Ample time during both overview and technical briefings is devoted to discussion, both to clarify a panel’s understanding and to convey the observations and understandings of individual panel members
to ARL's scientists and engineers (S&Es). The panels also devote sufficient time to closed-session deliberations, during which they develop consensus findings and identify important issues or gaps in the panel's understanding. Those issues or gaps are discussed during follow-up sessions with ARL staff so that the panel is confident of the accuracy and completeness of its assessments. Panel members continue to refine their findings, conclusions, and recommendations during written exchanges and teleconferences after the meetings. When necessary, the panels receive presentations that are classified at the Department of Defense (DOD) "Secret" level. This report does not contain classified information.

In addition to the insights gained from the panel meetings, Board members receive exposure to ARL and its staff at Board meetings each winter. Also, some panel members attend the annual planning meetings for ARL's Sensors and Electron Devices Directorate (SEDD) and Weapons and Materials Research Directorate (WMRD), at which those directorates discuss their programs with the directorates' customers. In addition, several Board members attended the 2003 and 2004 symposia that highlighted progress among ARL's Collaborative Technology Alliances (CTAs).

As previously noted, each panel normally reviews the work of a single ARL directorate. In 2004, at the request of the ARL Director, the Board undertook two additional reviews, forming two teams that assessed the activities within ARL that cut across several of its directorates in the areas of nanotechnology and robotics. The membership of each of the two review teams was drawn from several panels and was tailored to the areas of expertise required to address the crosscutting nanotechnology and robotics activities.
These reviews are intended to help ARL identify interactions, interdependencies, and opportunities for synergy across its directorates, as well as the state of the art of its nanotechnology and robotics R&D. Chapters 8 and 9 in this report summarize the results of those crosscutting reviews.

Assessment Criteria

The Board and panels applied assessment criteria organized in six categories (Appendix D presents the complete set of assessment criteria):

1. Effectiveness of interaction with the scientific and technical community: criteria that indicate cognizance of and contribution to the scientific and technical community whose activities are relevant to the work performed at ARL.

2. Impact on customers: criteria that indicate cognizance of and contribution in response to the needs of the Army customers who fund and benefit from ARL R&D.

3. Formulation of projects' goals and plans: criteria that indicate the extent to which projects address ARL strategic goals and are planned effectively to achieve stated objectives.

4. R&D methodology: criteria that indicate the appropriateness of the hypotheses that drive the research, of the tools and methods applied to the collection and analysis of data, and of the judgments about future directions of the research.

5. Capabilities and resources: criteria that indicate whether current and projected equipment, facilities, and human resources are appropriate to achieve success of the projects.

6. Responsiveness to the Board's recommendations: The Board does not consider itself to be an oversight committee. The Board has consistently found ARL to be extremely responsive to its advice, and so the criterion of responsiveness encourages discussion of the variables and contextual factors that affect ARL's implementation of responses to recommendations, rather than an accounting of responses to the Board's recommendations.

Completion of the Report

In July 2004, the Board met for 2 days to share members' summaries of their panels' observations and concerns. This report represents the Board's consensus findings and recommendations.

The Board's aim with this report is to provide guidance to the ARL Director that will help ARL sustain its process of continuous improvement. To that end, the Board examined its extensive and detailed notes from the many Board, panel, and individual interactions with ARL over the 2003-2004 period. From those notes it distilled a short list of the main trends, opportunities, and challenges that merit attention at the level of the ARL Director. The Board used that list as the basis for this report. Specific ARL projects are used to illustrate these points in the following chapters when it is helpful to do so, but the Board did not aim to present the Director with a detailed account of 2 years' worth of interactions with bench scientists. The draft of this report was subsequently honed and reviewed according to NRC procedures before being released.

ARMY RESEARCH LABORATORY SUPPORT FOR WAR-RELATED OPERATIONS

This is an extraordinary time for the Army Research Laboratory and indeed for the country as a whole. Examining ARL's support to current war efforts was not a direct charge to the Board, but it became apparent that, in addition to performing its typical R&D functions for the Army, ARL has responded quickly and effectively, applying its multiple talents to serious problems as they have arisen in war-related operations in Iraq and Afghanistan. ARL is to be commended for its dedicated and skilled efforts, which have saved warfighter lives and equipment and enhanced the capabilities of U.S. forces. Contributions have come from across ARL.
ARL's current contributions in this arena reflect its unique role as the link within the Army between scientific and technical expertise and specific Army applications, a role that requires ARL to maintain its readiness to contribute when problems arise in the field. This link and consistent contributions have existed across previous war efforts; only the contributions to the current war efforts are discussed here.

The Computational and Information Sciences Directorate (CISD) currently has one soldier in Iraq, and an Army Reservist working for the directorate has returned from a 1-year deployment. CISD has supported and continues to support ongoing operations in Afghanistan, Iraq, and other places through multiple programs, including the following:

• PacBots. These small robots have been deployed to Afghanistan to help clear bunkers, ammunition caches, caves, buildings, and walled compounds;

• Phraselator. This device provides one-way language translation support in a ruggedized personal digital assistant (PDA);

• Forward Area Language Converter (FALCon). CISD conducts multilingual research to provide tools for translating documents found in the theater;

• Document Exploitation Suite (DOCEX). This high-speed, adaptable capability aids in identifying, prioritizing, translating, and managing foreign language materials by automating the handling of foreign documents and media;

• Integrated Meteorological System (IMETS). The Army's weather information management system and the weather component for Intelligence Preparation of the Battlefield, IMETS is intended to provide commanders with automated weather observations, forecasts, battlefield visualization, and weather effects decision aids; and

• Acoustic Battlefield Aid. This tactical decision aid uses acoustic sound propagation to identify
areas in which U.S. military assets can be seen and/or heard or not seen and/or not heard. ARL conducts research for the development and evaluation of acoustic propagation models for use in the long-range detection of infrasonic signals (<10 Hz). (Infrasound monitoring examines signatures from human-made and naturally occurring infrasonic sources and the environmental impact on the signals.) ARL also installed an infrasound array and supported data collection and analyses in Korea.

Two employees of the Human Research and Engineering Directorate (HRED) supported Operation Enduring Freedom as part of the Army Materiel Command Logistical Support Element in Southwest Asia during fiscal year (FY) 2003-2004. Also, in collaboration with Carnegie Mellon University and with support from the program manager for Close Combat Systems, HRED significantly improved the probability of detecting low-metal land mines, to greater than 98 percent, with the Army's newly employed AN/PSS-14 Handheld Standoff Mine Detector System by applying MANPRINT (Manpower and Personnel Integration) throughout the acquisition cycle.

The Sensors and Electron Devices Directorate (SEDD) has provided active support to war-related operations. For SEDD, three individuals are or were in the field, and 38 individuals are supporting theater (e.g., Iraq, Afghanistan) projects. SEDD efforts include the following:

• Acoustic localization for sniper and mortar detection. The effort in this area led to the fielding of systems in 45 days;

• Acoustic database. Data are provided for use with acoustic microsensors;

• Booby trap detection. Work in this area led to improved improvised explosive device (IED) detection;

• HMMWV (High-Mobility Multipurpose Wheeled Vehicle, or "Humvee") gun mount;

• Infrasonic arrays (described above; see the item on "Acoustic Battlefield Aid" under CISD efforts); and

• Initial work on disposable sensing concepts.
This work is focused on sensors for use while clearing buildings during military operations in urban terrain.

During FY 2003-2004, the Survivability and Lethality Analysis Directorate (SLAD) had four individuals in the field and nine supporting theater projects. SLAD anticipated three individuals in theater starting in September 2004. SLAD efforts include the following:

• Improvised explosive device countermeasure equipment. This IED countermeasure equipment is expected to be in theater by September 2004, with approximately 500 of the countermeasure devices in theater by the end of November 2004; the U.S. Air Force and other government agencies have also ordered countermeasure devices;

• Information Assurance Network Assessment. This assessment will cover the three network architectures currently deployed in theater; and

• Survivability link between SLAD and deployed units. This link is being established on the Secret Internet Protocol Router Network (SIPRNet).

The Vehicle Technology Directorate (VTD) has not been directly connected to any current projects in either the Iraq or Afghanistan theater, but it does consult for Fort Eustis, Virginia, which provides the first line of fleet support in aviation. Moreover, its Active Stall Control Engine Demonstration Program is clearly aimed at a serious problem endemic to those theaters.

The Weapons and Materials Research Directorate's (WMRD's) Terminal Effects Division has had 3 personnel in theater, and approximately 45 WMRD scientists and technicians have been involved in supporting war-related activities. The types of support that they provide and have provided include the following:

• Design of the rear protection shield for Abrams tanks;

• Armor survivability kit for HMMWVs. ARL built the first 40 of these kits, then transitioned the work to the Tactical Command, which has built more than 10,000 based on ARL's design;

• Battle damage assessment. This capability provides technical information on kills to combat vehicles; and

• Field expedient armor solutions. This capability provides technical information on ways to improve the survivability of tactical and ground combat vehicles.

CROSSCUTTING ISSUES

In addition to performing special examinations of crosscutting activities in nanotechnology and robotics, described in the last two chapters of this report, the Board identified three crosscutting issues that are discussed in more detail in Chapters 2 through 7, which summarize the assessments of the directorates. As described below, these issues are modeling and simulation, information assurance and security, and interdirectorate activities.

Modeling and Simulation

The appropriate use of modeling and simulation could be a unifying capability with broad implications across many of the ARL directorates. Currently, however, there continue to be key issues relating to modeling and experiments that researchers and investigators have not properly considered.
They include the following:

• Verification (i.e., that a computer program does what it was intended to do);

• Validation (i.e., that the computer program produces results that are valid in and relevant to the domain in which it was intended to operate);

• Use of a variety of standard operating practices for performing calculations and presenting results (e.g., the use of appropriate dimensionless variables); and

• The consequences of the widespread replacement of experiments by computational modeling.

Attention to these aspects of modeling and simulation could be enhanced by an effort to instruct and support ARL scientists in the application and science of modeling and simulation techniques. The goal would be to build on commercial software whenever possible, but to extend and enhance the modeling and simulation capability to address the particular needs of the various directorates. This support for a modeling and simulation capability, coupled with the enormous computational capability of ARL, could be a unique asset that enhances the research being performed across a wide range of activities.

ARL is not self-sufficient when it comes to providing its scientists with simulation and modeling capabilities. Rather, it relies on codes developed by other entities. Examples include CTH (Sandia National Laboratories) for explosion dynamics and large-deformation solid mechanics; MM5 (National Center for Atmospheric Research/Pennsylvania State University) for mesoscale atmospheric modeling;
and a variety of National Aeronautics and Space Administration aerodynamics codes. While this is a cost-effective way of providing capability for ARL in many respects, it requires a commitment to acquiring a deep understanding of such codes in-house. This same understanding allows ARL staff to engage the developers seriously in adding capability, or possibly to add the capability themselves. Failing to add needed capability in one way or another can lead to ad hoc efforts to work around gaps in the extramural code, rather than to permanent new capability in that code. Examples of such projects include the nowcasting atmospheric modeling effort within CISD and the munitions modeling code within WMRD. While both are competently executed efforts, each was based on 20-year-old algorithm technology. Furthermore, because they are based on old technology and are not being incorporated into the mainline production codes, they are unlikely to persist as useful artifacts.

Information Assurance and Security

Information assurance and security are matters of concern across ARL directorates. Both CISD and SLAD, for example, have important roles in this area. The Board believes that CISD should be taking a larger role within ARL and that insufficient resources (staffing), rather than a lack of interest, are preventing the directorate from fulfilling this role. Information security issues also arise internally at ARL: a significant amount of the code in use comes from a variety of sources outside ARL, but there is no clear approach to verifying that it is free of potentially damaging capabilities. A comprehensive strategy should be pursued to ensure that the ARL information technology infrastructure cannot be compromised.

Interdirectorate Activities

One of the strengths of ARL is the breadth of activities underway.
This breadth makes it possible, if the appropriate leadership and incentives are put in place, to create cross-directorate programs that would exploit ARL's distinctive capabilities. The crosscutting nanotechnology efforts reviewed for this report show potential for moving in this direction. While the work reviewed here was a series of relatively independent efforts, it became clear during the review that many opportunities exist for collaborating and developing common infrastructure. Other opportunities for such collaboration are discussed in the individual directorate reports. The process for undertaking this type of collaborative activity is essentially undefined, however, and lacks the structural support needed to encourage it.

Crosscutting activities such as those discussed here should be strongly encouraged. Doing so will require significant changes in ARL's approach to program management. The Board commends the ARL Director for initiating the crosscutting reviews in the areas of nanotechnology and robotics; these constitute important steps toward identifying programmatic needs and opportunities in these areas and toward learning lessons that can be applied to other crosscutting activities.
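The distinction drawn above between verification and validation of simulation codes can be made concrete with a minimal sketch. The projectile-range model, the measured value, and every number below are hypothetical illustrations chosen for brevity; they are not drawn from any ARL code or data set.

```python
# Sketch of verification vs. validation for a simulation code.
# The model and all numbers are illustrative assumptions, not ARL data.
import math

def projectile_range(v0, angle_deg, g=9.81):
    """Simulated range (m) of an ideal, drag-free projectile."""
    theta = math.radians(angle_deg)
    return v0 ** 2 * math.sin(2 * theta) / g

# Verification: does the program do what it was intended to do?
# Check the implementation against a known analytic result.
assert abs(projectile_range(10.0, 45.0) - 10.0 ** 2 / 9.81) < 1e-9

# Validation: are the results valid in the intended domain?
# Compare model output with (hypothetical) measured ranges; a large
# discrepancy signals that neglected physics (here, drag) matters.
measured = {(30.0, 45.0): 78.5}  # (v0 m/s, angle deg) -> observed range, m
for (v0, ang), obs in measured.items():
    predicted = projectile_range(v0, ang)
    rel_err = abs(predicted - obs) / obs
    print(f"predicted={predicted:.1f} m, observed={obs} m, rel. error={rel_err:.1%}")
```

A code can pass verification while failing validation, as in this sketch: the implementation matches its analytic specification exactly, yet overpredicts the observed range because the underlying model omits drag.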
