
Review of NASA's Aerospace Technology Enterprise: An Assessment of NASA's Pioneering Revolutionary Technology Program (2003)

Chapter: 3. Report of the Panel on Computing, Information, and Communications Technology

Suggested Citation:"3. Report of the Panel on Computing, Information, and Communications Technology." National Research Council. 2003. Review of NASA's Aerospace Technology Enterprise: An Assessment of NASA's Pioneering Revolutionary Technology Program. Washington, DC: The National Academies Press. doi: 10.17226/10810.


Report of the Panel on Computing, Information, and Communications Technology

INTRODUCTION

The Computing, Information, and Communications Technology (CICT) program is one of three programs under NASA's Pioneering Revolutionary Technology (PRT) program. The CICT program in turn comprises four broad, level 2 projects (see Table 3-1):

· Space Communications (SC) project
· Intelligent Systems (IS) project
· Information Technology Strategic Research (ITSR) project
· Computing, Networking, and Information Systems (CNIS) project

Each project is divided into level 3 elements, and those elements into tasks. The CICT program, funded at $138 million for FY2002, comprises 242 individual research tasks.

TABLE 3-1 Computing, Information, and Communications Technology (CICT) Program Organization and Budget, FY2002-2003

                                                          Budget (million $)
                                                          FY2002    FY2003
CICT program, total                                        137.5     153.3
Projects
  Computing, Networking, and Information Systems (CNIS)     42.7      40.9
  Intelligent Systems (IS)                                  59.3      75.9
  Information Technology Strategic Research (ITSR)          28.4      29.0
  Space Communications (SC)                                  7.1       7.5

SOURCE: Tu (2002) and Andrucyk (2003).

The goal of the CICT program is to "enable NASA's scientific research, space exploration, and aerospace technology missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies" (Tu, 2002). The CICT program plans to accomplish this goal by

· Creating goal-directed, human-centered computer systems where the tools are more adaptive and computers can work collaboratively with humans,
· Enabling seamless access to NASA information technology in all locations, including space,
· Enabling high-rate data delivery that provides continuous presence in all locations that NASA operates, and

· Developing a broad portfolio of information technologies and bio- and nanotechnologies that have the potential to revolutionize future NASA missions (Tu, 2002).

REVIEW PROCESS

The National Research Council's Panel on Computing, Information, and Communications Technology (referred to as the CICT panel in this report) conducted its review in two phases. (Biographies of the panelists may be found in Appendix B.) The first phase was to gain an understanding of the top-level objectives of NASA's CICT program as the program relates to overall NASA needs. This phase was completed at the first meeting of the CICT panel, June 10-13, 2002, at NASA Ames Research Center in Mountain View, California. The second phase of the review was aimed at understanding the quality and technical merits of individual tasks being conducted under the auspices of the CICT program. To accomplish this task-level evaluation, the panel gave CICT management a one-page questionnaire, which the management distributed to some 242 task managers and principal investigators (PIs). A copy of the questionnaire can be found in Appendix E. The CICT panel then evaluated the individual tasks by referring to the questionnaires, conducting follow-up site visits, reviewing technical publications, and talking directly to PIs as needed. Subpanels of the CICT review panel visited three sites: Ames Research Center in California (June 13, 2002, and April 14, 2003), the Jet Propulsion Laboratory (JPL) in California (July 2, 2002), and Glenn Research Center in Ohio (July 24, 2002).

This report discusses top-level issues that are relevant to the entire CICT program in the next section, "Overall Observations."
Other sections discuss the research portfolio of the CICT program, the quality of CICT research plans and overall methodology, how well the CICT program has connected with the community outside NASA, and the quality of the technical staff and facilities at the NASA CICT facilities visited by the CICT panel. Specific tasks are highlighted throughout the report as illustrative examples.

OVERALL OBSERVATIONS ON THE CICT PROGRAM

During the review process, the CICT panel placed each task into one of three broad categories:

· World-class,
· Good work focused on the NASA mission, and
· Work that is complete or that should be discontinued.

The great majority of the work reviewed by the CICT panel was good, NASA-focused research. Research categorized as excellent by the CICT panel was work that was typically state of the art and at the same time directly focused on the NASA mission. Such research showed high productivity in terms of published papers, delivered hardware and software, and public presentations. World-class work appeared to address a specific customer or set of customers, regardless of the task's technological maturity.

If a task is not mentioned at all in this report, the CICT panel has deemed that the effort was good work focused on the NASA mission. Such work should continue in the current CICT program plan. This work demonstrated that the researchers had generally well-defined hypotheses, directions, and products to build. While not state of the art, the work was good and focused enough for its undisturbed continuation.

There were two general criteria for work that was complete or should be discontinued. First, work being conducted by CICT that was primarily service-oriented was called into question by the CICT panel. There are several instances discussed in this report where tasks produced useful products and should be transitioned out of the research budget and into NASA operations and their separate funding lines.
Second, research tasks that the CICT panel recommended for discontinuation are efforts that the panel believes do not contribute to the NASA mission and therefore are not appropriate for NASA to continue. This type of research typically showed little in the way of productivity: few or no papers published, little or no software developed, generally few or no public presentations, and little or no direct applicability to a NASA mission. Quite often such low-productivity efforts had high full-time-equivalent (FTE) values. The CICT panel was concerned that this situation indicated a significant amount of effort was being put into the task with little return. In general, the support of work that

did not appear to map well to NASA missions, or was duplicative of efforts being carried out external to NASA, appeared to be unwise and unnecessary. The CICT panel looked at these tasks carefully to help NASA assess whether a critical mass of research was being carried out.

The CICT panel highlighted the 17 out of 242 tasks that are examples of world-class work:

· Intelligent Systems (IS) project
    - Spacecraft Micro Robot
    - Automated Science Investigation Using Multiple Rovers
    - An Onboard Scientist for Multi-Rover Scientific Exploration
    - A Hybrid Discrete/Continuous System for Health Management and Control
· Information Technology Strategic Research (ITSR) project
    - Quantum Dot Infrared Photodetector (QDIP) Focal Plane Arrays for NASA Applications
    - Nanoscale Acoustic Sensors Using Biomimetic Detection Principle
    - High-Throughput Metabolic Profiling by Multidimensional Nuclear Magnetic Resonance and Mathematical Modeling of Metabolic Networks
    - Advanced Semiconductor Lasers and Photonic Integrated Circuits
    - Intelligent Flight Control
· Space Communications (SC) project
    - Reconfigurable Antennas for High Rate Communications
    - Liquid Crystal Based Beam Steering
    - Internet Protocol (IP) Infrastructure for Space Systems
    - Micro-Surface Wireless Instrumentation Systems
    - Radio Frequency (RF) Microphotonics
    - Efficient Deep-Space Laser Communications
    - High Efficiency Ka-Band Metamorphic High Electron Mobility Transistor Monolithic Microwave Integrated Circuit
    - High Efficiency Miniature Traveling Wave Tube Amplifier

In summary, ITSR had five world-class tasks, SC had eight world-class tasks, and IS had four world-class tasks. One project, Computing, Networking, and Information Systems (CNIS), had no world-class tasks.

The CICT panel also identified nine tasks that were complete and should be moved out from under CICT, or that were of questionable value to NASA's core mission and should be discontinued:
NOTE: While the Liquid Crystal Based Beam Steering task could use a better understanding of space qualification requirements, the task is still considered by the panel to be world-class for its potential impact on space architecture.

· CNIS project
    - Grid Infrastructure Support
    - Development and User Services
· IS project
    - Model-Based Programming Skunk Works
    - Mind's Eye: Knowledge Discovery Process Capture
    - Automated Discovery Procedures for Gene Expression and Regulation for Microarray and Serial Analysis of Gene Expression Data
    - Robust Intelligent Systems Based on Information Fusion
· ITSR project
    - Low Dimension Nanostructures and Systems for Devices and Sensors
· SC project
    - Backbone Network Communication Architectures
    - Distributed Space Communications Systems
    - Large-Scale Emulations

During the course of this review, the CICT program demonstrated that it had taken appropriate action and either terminated or redirected these nine tasks (Tu and VanDalsem, 2003).

Finding: Most of the work being conducted under the CICT program is good, NASA-focused research.

GENERAL OBSERVATIONS

The CICT panel made some observations on matters of concern that showed up in the CICT program. These observations are general, and there are numerous exceptions to them within the CICT program.

Research Program Architecture

The CICT program would be more uniformly effective if the communication lines between program-level management and task-level PIs were clearer and better established. Problems with communication were evident to the panel during its information-gathering phase. Also, the panel sensed that, to some extent, CICT management was required to "force fit" a top-level research vision onto disparate research tasks that it had inherited from other programs (Tu, 2002). The CICT panel believes that NASA could address this concern by using a research program architecture for the current CICT program as well as an architecture that identifies future targets. Such a program architecture is a framework that would clearly define the program's scope (what is included in the program and what is not), the relationships among the components within the framework, and the principles and guidelines by which the components are to function.

This framework should be applied cautiously, however. NASA should ensure that there are organizational mechanisms in place that allow for research, inspiration, and radical advances to shine through in a bottom-up manner. The framework would help CICT management to (1) organize interrelationships and dependencies among related research investments, (2) distinguish redundancies from complementary efforts, (3) understand where program gaps exist, and (4) describe the key technologies addressed by research projects. The architecture would also help CICT management alter the course of research based on tasks that generate solid results. Gaps between the actual and the desired state of task completion would identify deficiencies as well as high-payoff areas for future research investments. The CICT panel derived a set of key technologies, which it listed in the first column of Table 3-2 for NASA's consideration.

In addition, it appeared to the CICT panel that some of the tasks should have been described as a product development effort rather than a research effort.
In a research program architecture, CICT management should clearly and correctly identify what is research and what is development and speed the movement of research activities into development as appropriate.

Recommendation: CICT management should establish clear research program architectures to improve communication between top-level management and the task PIs, as well as to improve the overall effectiveness of the program.

Service-Oriented Tasks

The pathway from research to development to service is generally not well defined at the task level within the CICT program. On several occasions, the panel identified tasks that originally started as research and produced very good and useful engineering or research tools. Once the tools were established, the task within CICT became one of maintaining the tools for use by NASA as a whole (Alfano, 2002). Two examples of such activities are the tasks (1) Grid Infrastructure Support and (2) Development and User Services, both under the CNIS project. These two tasks are of questionable value to NASA's core research and development mission, since the basic research portion of the project is complete.

The CICT review panel strongly believes that CICT management needs to establish a mechanism to quickly transition final products, such as grid tools, to a service unit or entity outside the CICT program. This service unit can then maintain the infrastructure of the tools so that the rest of NASA, and even researchers from CICT, can use them. Of course, the service unit may naturally consult and seek guidance from the original tool developers from CICT when engineering changes to the tool are required.

Recommendation: To establish a more effective research program, CICT management should periodically review all CICT tasks to ensure that they are centered on productive research and development efforts.
Any tasks that are providing a service, or those for which the research component is complete, should be quickly transferred out of the CICT program.

In response to the interim letter report of the PRT committee (NRC, 2003), CICT implemented a new management practice: most tasks under the CICT program will be reviewed by external peer review panels in the same manner that NASA NRA proposals are selected (Tu and VanDalsem, 2003). The CICT panel commends NASA for taking this strong action but cannot yet assess the effectiveness of the peer review, since it had not been conducted at the time of this report. The panel does, however, encourage NASA to reinforce the message to the advisory panels being formed that there should be a clear delineation between service-oriented tasks and research and development tasks, as discussed here.

TABLE 3-2 Relationship of Technology Expertise Areas to NASA Abilities and Goals

High-performance computing (CNIS)
  CICT status: Led by industry (hardware, especially Japanese) and consortia (software), not CICT.
  Relevance to NASA mission: High-performance computing directly applies to many NASA issues.
  Positives: CICT has considered some unique and difficult problems (e.g., large-scale shared memory).
  Selected areas for improvement: Can be connected more closely to the broader high-performance computing community. Work closely with appropriate standards organizations to influence emerging standards.

Networking (CNIS)
  CICT status: Following industry leadership.
  Relevance to NASA mission: NASA is a network-dependent organization.
  Positives: CICT is paying some attention to monitoring and improving network utilization.
  Selected areas for improvement: Can work with network industry partners to transfer technology. Work closely with appropriate standards organizations to influence emerging standards.

Algorithms for scientific computing (CNIS)
  CICT status: NASA (not industry) is problem focused.
  Relevance to NASA mission: NASA has extensive developments in scientific and engineering applications.
  Positives: Continuing to improve the algorithms that are core to NASA scientific applications.
  Selected areas for improvement: Can shift from the current focus, which is incremental improvements to existing algorithms, toward inventing new fundamental approaches to additional problems.

Distributed computing (CNIS)
  CICT status: Cooperating with others to establish the state of the art, but the technology has matured to the point of general deployment.
  Relevance to NASA mission: Improves usefulness of current NASA computation resources.
  Positives: Working to extend services, use capabilities, and transfer technology.
  Selected areas for improvement: Since the technology is rapidly maturing to the point of relatively few new research opportunities, it can be expected to transition to general deployment.

Autonomous robots (IS)
  CICT status: NASA is the international research leader on mission-specific applications.
  Relevance to NASA mission: Essential for unmanned missions.
  Positives: Integration of multiple disciplines into a coherent whole. Excellent experimental processes and demonstrations.
  Selected areas for improvement: Possible to improve collaboration between NASA and university researchers.

Planning and scheduling (IS)
  CICT status: NASA is the international research leader.
  Relevance to NASA mission: Essential for robotics, on-board activities, and mission planning.
  Positives: Multiple approaches being investigated. Being implemented in late 2000s missions (both mission planning and robot task planning). Supporting excellent university research.
  Selected areas for improvement: Can improve collaboration between internal NASA researchers on preferred techniques for planning and scheduling. Can also develop detailed understanding of the most appropriate application of different approaches.

TABLE 3-2 (continued)

Data mining (IS)
  CICT status: At one time NASA was an international leader, but the agency has lost several key personnel.
  Relevance to NASA mission: NASA has significant internal needs to analyze complex engineering data and imagery.
  Positives: Applying different data mining approaches to specific projects to determine preferred usage patterns. Providing tools for end users (technology transfer). Using mining techniques for scientific/engineering data.
  Selected areas for improvement: Can regain leadership status to address NASA-specific problems (massive amounts of complex data gathered rapidly and/or remotely). Pay additional attention to the full end-to-end data mining process (initial gathering to analysis to interpretation to archiving). Can increase work in visualization (just a couple of CNIS tasks) to complement the data mining activity. Improve the tie between current application areas and NASA projects and NASA scientists (e.g., work with biology data).

Human-computer interface (IS)
  CICT status: CICT follows rather than leads research directions and trends. Stronger in the astronaut and operations areas. Little evidence of progress with design/engineering.
  Relevance to NASA mission: Area is highly relevant to astronauts, operations, and design/engineering.
  Positives: Having such a program is essential to NASA. Looking at alternative astronaut input modalities (albeit at a low level of effort).
  Selected areas for improvement: Can concentrate on longer-term tasks since current tasks are quite short term. Can address fundamental issues in collaboration and visualization.

Software validation and verification (ITSR)
  CICT status: An international leader in applying formal methods and techniques.
  Relevance to NASA mission: Highly reliable software is essential to NASA.
  Positives: Application to real problems with some success. Understanding of problems of scale. Cadre of skilled practitioners developed.
  Selected areas for improvement: Positive results to date indicate that NASA-wide interest will expand rapidly. Need to consider and plan for the daunting task of making validation and verification a NASA-wide effort.

New computing paradigms (ITSR)
  CICT status: Neophyte in an emerging field. Unlikely to impact NASA missions in the next 10 years.
  Relevance to NASA mission: Supports long-term need to find faster ways to compute.
  Positives: Learning about the field. Low level of expenditure.
  Selected areas for improvement: Can attempt to understand the nature of NASA missions at least 10 years in the future to determine applicability of the new computing paradigms.

Nanotechnology (ITSR)
  CICT status: Beginning to develop skills in a specific area.
  Relevance to NASA mission: Stronger relevance will emerge with clearer definition of the relationship between ongoing research in CICT and research being conducted in other NASA areas (e.g., sensing materials).
  Positives: Experimental efforts on carbon nanotube research to validate theory. Impressive nanostructure etching and cryogenic camera technology.
  Selected areas for improvement: The very strong emphasis on carbon nanotube research should be continuously scrutinized for its ultimate practicality. Other microsystem technologies should be considered and weighed against the carbon nanotube work.

Continues

TABLE 3-2 (continued)

Space communications hardware (SC)
  CICT status: Leader in developing and using NASA-unique technology.
  Relevance to NASA mission: Essential to continued missions.
  Positives: Understanding of real problems with broadly applicable techniques. Developing new techniques in collaboration with industry.
  Selected areas for improvement: Can plan for an increase in activities to accommodate high-bit-rate transmission for long-distance missions in the future.

Space communications protocols (SC)
  CICT status: Limited part of the CICT portfolio.
  Relevance to NASA mission: Essential to continued missions.
  Positives: Accept standard protocols to accommodate the long-term nature of NASA missions.
  Selected areas for improvement: Can develop experiments to improve current standard practice. Can support improved protocols in space-to-earth transmissions that will accommodate large datasets and extended delays.

Final Research Applications

The CICT panel observed on numerous occasions what seemed to be a lack of understanding of the requirements for final application of the work being conducted, be it aeronautics or space. In particular, there was often little understanding of the requirements for space qualification of certain hardware and software (Tu, 2002). Indeed, it sometimes appeared as if space deployment was not a measure of success for some of the tasks, even though the clearly stated long-term goal of such research was for hardware or software to be placed on space vehicles. In addition, some task plans said little about how to transition a task from research to deployment, even when these tasks were being conducted in support of a specific mission.
The tasks Liquid Crystal Based Beam Steering and Multibeam Antennas, both under the Space Communications project, and the Flexible Access Networks element are examples of undertakings where a greater understanding of the space qualification requirements for hardware and software would benefit the work being conducted. It was not clear if this deficiency was caused by insufficient interaction with mission program managers or was simply an oversight on the part of the researchers.

Understanding the demands of the environment in which a research product may operate can easily change the research approach. For example, knowing that the Federal Aviation Administration (FAA) has to ultimately certify onboard pilot advisory systems might lead researchers to discover techniques that are more amenable to certification processes. Or, if a researcher knows that a data set must be analyzed within certain time and memory constraints, he or she could adopt techniques that would be more amenable to satisfying these constraints.

Recommendation: To ensure that task goals are properly oriented, CICT management should ensure that principal investigators and managers clearly understand the requirements of the environment in which the research products will be used. This is especially important for tasks whose stated goal is ultimately to place a hardware or software product in space.

Final Products and Research Benchmarks

Task deliverables are important long-term benchmarks of success. Without them, it is difficult for managers to judge the effectiveness of a research program. While the majority of tasks under the CICT program were good, a subset of tasks often did not have clearly defined products or system deliverables or clearly identified customers. Even under a pure research agenda, benchmarks for success should be established early in the process by task PIs in coordination with the eventual customer for the research.
Put another way, if the PI has a specific application with a potential internal or external customer, that customer should be involved in setting the benchmarks for the task. If each task has a clearly defined deliverable or measure of success,

CICT management will be able to manage the program more effectively.

Recommendation: To manage the technical quality of work more effectively so that research tasks are meaningful and on track, CICT management should ensure that each task has a clearly defined, realistic yet challenging measure of technical success.

RESEARCH PORTFOLIO

All four projects under the CICT program (SC, IS, ITSR, and CNIS) are working to develop revolutionary technologies and technology solutions to enable fundamentally new aerospace capabilities and missions (Venneri, 2001). The CICT panel verified its expectation that the four project areas would cover very different kinds of tasks and fundamental technologies. Specifically,

· SC covered the hardware and protocols for communicating and transmitting data in space.
· IS covered autonomous robots, planning and scheduling, data mining, and the human-computer interface.
· ITSR covered software validation and verification, new computing paradigms (e.g., quantum, evolutionary, and bio computing), and nanotechnology.
· CNIS generally covered research in high-performance computing, networking, algorithms for scientific computing, and distributed computing.

The portfolios of the four projects contain research tasks that range from concept development to application development. The CICT program has a reasonable balance between fundamental research and applied research. The portfolios are also characterized by different expertise levels when contrasted with the outside technical community. For example, NASA has led the country in work on autonomous robots and the methods by which they operate. It has maintained its position as an international research leader in mission-specific robotic applications for over a decade.
On the other hand, universities, industry, and national laboratories have performed and currently lead the field in fundamental research in microelectromechanical systems (MEMS) and nanotechnology, so that in this case NASA is not leading the research charge. Rather, NASA is investing, justifiably, in nanotechnology to assess possible applications and determine methods that will infuse this new technology into NASA products and missions.

Such differences are natural. NASA will lead in some research that is mission-critical, either by working on it in-house or by outsourcing, and will follow in other research that may become mission-critical in the future. The panel believes that it is essential to maintain this perspective when attempting to assess the value of the entire CICT research portfolio. This chapter looks at each technology area from the standpoint of how NASA is or is not positioned to lead or exploit that area; strengths and weaknesses in the general tasks within each area; and those areas that require additional NASA attention in order to improve.

Detailed Assessment of Research Portfolio

The panel has determined that the overall CICT research portfolio contains good research projects that support NASA objectives. Four technology areas (comprising multiple tasks) are world-class (criteria listed in Chapter 2):

· Autonomous robots (IS)
· Planning and scheduling (IS)
· Application of software validation and verification (ITSR)
· Space communications hardware (SC)

These technology areas are generally driven by a need unique to NASA that is not being fulfilled by industry, academia, or other government agencies. The panel urges CICT management to examine these areas in detail so that other segments of the CICT program may emulate their success. The status of these and other technology areas within CICT and their relevance to NASA missions are presented in Table 3-2, under "selected areas for improvement."
The panel suggests possible future directions within each technology area. However, these suggestions are not intended to imply that there are deficiencies throughout the CICT program.

Finding: The overall CICT research portfolio is very good and supports NASA objectives. Four technology areas (comprising multiple tasks) in CICT were judged world-class: autonomous robots,

planning and scheduling, software validation and verification, and space communications hardware.

Overlap with Other PRT Programs

As the CICT panel assessed the overall PRT portfolio and the CICT program's role within PRT, it faced the challenge of understanding the manner in which the portfolio was organized and evaluated. The panel observed that some projects in the ECT and CICT programs appeared not to be clearly bounded. This was especially true for nanotechnology. The CICT panel examined the overall CICT research portfolio and also had high-level exposure to the ECT and ECS programs, the other two programs that make up PRT. CICT panel members did not, however, receive information on the full scope of research in the PRT program or across NASA. There may well be research in other parts of the PRT program and NASA, such as research in MEMS for microsensors, distributed and microspacecraft, and intelligent systems, that might alter some of the recommendations of the CICT panel.

Recommendation: The CICT panel recommends that CICT and PRT management act to ensure (1) that there is adequate communication between related groups in ECT and ECS, (2) that the overall research portfolio is well balanced in areas of potential overlap, and (3) that all task PIs working in the areas of potential overlap are aware of the high-level goals for their research.

Expanding Existing Research Areas

In analyzing the CICT portfolio, the panel occasionally struggled with the definitions and scope of specific CICT expertise areas. To clarify the analysis, the CICT panel discussed specific aspects of "working in the small" in the case of nanotechnology, and "working with people" for human-centered computing. The following two sections provide some ideas for NASA to consider when looking to expand the scope of research areas.
Nanotechnology: Working in the Small

In the CICT program, research to bring about smaller, better-performing, cost-effective systems is plainly consistent with the NASA mission and the general field of nanotechnology. Most of the funding bearing the nanotechnology label under the CICT program is directed toward basic materials science studies of carbon and carbon compound nanotube materials (Alfano, 2002). Giant steps need to be taken, however, before this research area can produce hardware of use to NASA. Nanotechnology is far less certain to be incorporated into NASA missions than, for example, microsystems research based on more established technologies and materials. The panel believes that the nanotechnology work in the CICT program is very narrow in its scope and that, by itself, the work seriously overlooks important, promising research areas such as those focused on lightweight, high-strength materials that are of obvious relevance to NASA for launch into space. The panel believes that there is significant work being done within NASA, but outside the CICT program, on a variety of MEMS and in areas sometimes classified as nanotechnology. Even given the limited purview of the CICT review panel, it appears that the nanotechnology work under the CICT program is too narrow in scope.

Recommendation: CICT nanotechnology research efforts should be assessed in terms of their potential contributions to NASA missions. More direct focus on potential applications is needed, as well as coordination between programs that could interact to provide advances in microsystems.

Human-Centered Computing: Working with People

The CICT panel defined "human-centered computing" to include the assessment of the impact of computing technology on people as well as the development of tools and techniques that facilitate interaction between humans and computers. The ways people interact with computing systems are expanding rapidly.
Figure 3-1 illustrates the expansion of the technology and user base but shows that research funding levels do not yet extend to technology areas where growth is anticipated. It is crucial to the NASA mission for NASA to have cutting-edge expertise in human-centered computing. Outside NASA, the considerable development in the human-computer interface area focuses quite naturally on the most frequent circumstance, namely a single user dealing with a midsized display. For NASA, however, communication travels over a number of routes with disparate interface environments. For example, mission control's display mechanisms differ greatly from those in the cockpit of a human-occupied spacecraft.

FIGURE 3-1 Future expansion of the technology for human-centered computing.

There is little work in the CICT portfolio focused on how scientists and engineers can improve their productivity through collaboration and collaborative environments (Yen, 2002). It is essential to continue the efforts in human-computer interaction to evaluate and understand how NASA's people will work more effectively with computing systems. NASA must also consider a rapidly expanding and challenging environment for its people that goes far beyond the "single user with a midsized display" paradigm. Small display screens, which will be used throughout NASA both on earth and in space, still pose exceedingly difficult problems when used to display complex instructions or graphics. Such screens, as well as distributed human-computer interaction, are challenges that require additional work.

In terms of the overall impact on people (both earth-bound and space-bound), research in how to work collaboratively is essential for increasing staff productivity before, during, and after NASA missions. In addition, the skill base for the highly technical work that NASA performs and contracts is located at laboratories across the country and must often be brought together at a central location, virtual or physical. Much of the work the CICT panel described as world-class involved teams that are physically colocated, a characteristic that is becoming increasingly rare.
The implication here is that if NASA enables virtual colocation by using new collaborative technologies, more teams may reach world-class status.

NASA has substantial skill in cognitive human factors assessment. In fact, the team that has emerged over the course of the review is particularly strong, especially in terms of its links to universities. There was little evidence, however, that the team's skills were being used to improve collaboration or to improve the usefulness and usability of new devices or the user interface paradigms. The review panel acknowledges that there has been progress improving the user interface for individual users, for example, on the Mars Exploration Rover (MER) or the International Space Station (ISS). However, the CICT program has not yet addressed the significant fundamental research on distance collaboration and alternative device evaluation that is essential to the entire NASA community.

Finding: Collaborative work environments engaging geographically distributed users are becoming increasingly important to NASA's mission. These users will employ a wide variety of interactive devices.

Recommendation: CICT should increase the involvement of NASA human factors experts in the cognitive evaluation of collaborative environments. To ensure that the new technology is used in the most appropriate manner for NASA missions and research, CICT should work on new graphics and interactive device technology.

Critical Computing Expertise That May Be Missing

Based on the CICT panel's understanding of the NASA mission and the impact of computing on the goals of NASA, there are some areas of computing that are critical to NASA's excellence as a globally and spatially distributed enterprise. The panel did not find these in the CICT portfolio (Hine, 2002; Yan, 2002). This does not mean that such computing expertise is not covered in other areas of NASA. To be prudent, the panel points out these critical areas for NASA to review and act on appropriately.

Distributed Data Management

NASA scientists and missions generate terabytes of information that must be distributed and analyzed throughout the country. The CICT panel has observed that a significant amount of work is being done in this area at NASA Goddard Space Flight Center. Such work is fundamental research for projects in distributed computing in the CICT portfolio. In response to the PRT committee's interim report released in January 2003 (NRC, 2003), the CICT program has planned for a large effort in distributed data management titled Knowledge Access and Discovery Systems (KADS), to start in FY2005.
Although this is a delayed start, the CICT panel commends the CICT program for planning this effort.2

2The panel understands distributed data management to include location, replication, access, and configuration management.

Information Systems Architecture

The organization of interrelationships between information system components is essential for more than planning and technology roadmaps. The development of information system architectures is an emerging discipline.3 One very important goal here, which NASA should carefully plan for, is to ensure that all computing and data management software components developed under this architecture will work together. The architecture should also ensure that when a system is placed into use, individual components can be installed or implemented with little to no disruption. In addition, new strategies are needed to make highly distributed, parallel processing work efficiently in both real-time applications and conventional applications.

Recommendation: In order to make sure distributed NASA computing systems work together, NASA should establish a carefully developed information systems architecture.

RESEARCH PLANS AND METHODOLOGY

This section evaluates the plans and methodologies by which the tasks within the CICT program are carried out. In general, the CICT review panel found the high-level goals of PRT to be well defined and relevant to NASA's mission (Tu, 2002). PRT and its constituent programs, such as CICT, should also have clearly defined metrics. It was not clear to the CICT panel what the measurements for success are at the top level of PRT and its constituent programs. For instance, is CICT assessed against metrics such as technologies transferred to missions, publications, and commercialization? As stated earlier, the CICT panel encourages all managers within the CICT program to establish clear metrics as a means of evaluating the tasks under their purview.
3Information systems architectures will establish the implementation framework, interrelationships, principles, and guidelines.

Task Deliverables and Their Fit to NASA Goals

The CICT panel found task deliverables other than those of the SC project to be poorly defined. The SC project was exemplary in that it generally had clear objectives, measurable outcomes, and milestones

(Bhasin, 2002a). In the other projects, however, the task PIs did not seem to have a clear view of how their task fit into a program mission. Many task statements that the CICT panel received, for instance, did not list any customers. As examples, the following tasks within the IS project did not mention customers on their task description questionnaires. This list is not meant to be all-inclusive for the entire CICT program.

· Onboard Fault Identification for Planetary Rovers
· Domain-Specific Self-Adaptive Software
· Multi-Resolution Planning in Large Uncertain Environments
· Team-Oriented Robotic Exploration Tasks on Scorpion and K9 Platforms
· Probabilistic Reasoning for Complex Dynamic Systems
· Causal Reasoning
· Automated Data Management
· Distributed Data Mining for Large NASA Databases
· Robust Intelligent Systems Based on Information Fusion

This apparent disconnect between the task and NASA missions or even CICT program goals may be due, in part, to a lack of communication from top management to the PIs. NASA managers should clearly articulate and communicate to PIs the mission and the potential customers for various programs, as discussed earlier in the report. The following recommendation appears earlier in this chapter, but it also applies here.

Recommendation: To manage the technical quality of work more effectively so that research tasks are meaningful and on track, CICT management should ensure that each task has a clearly defined, realistic, yet challenging measure of technical success.

Maturing a Technology

It is vitally important that the excellent-quality research CICT conducts eventually be transferred to a main mission, be it internal or external to NASA. Thus, the maturation process for a technology is very important. The CICT panel found that the process for maturing research was clearly articulated for research directly related to a well-defined NASA mission.
It was, for obvious reasons, more vague for research that is long term and not directly applicable to current NASA missions. These long-time-horizon tasks with potentially high payoff (such as CICT's tasks on revolutionary computing and, in general, CICT's bio-nanotechnology efforts) are often at high risk of failure; that is, they may fail to reach the stated project goals.

It was also not clear to the CICT panel what process CICT has in place for allocating or deallocating resources to such long-term efforts. For instance, the quantum computing field will most likely not yield any technology directly usable by NASA in the next 20 years. While this is currently a good effort that is properly being funded by NASA, the CICT panel had general questions about such long-term projects. Will NASA continuously fund quantum computing over the next 20 years? Does NASA have the expertise to invest in the best research approaches in such an area?

The process NASA uses for transferring a technology to an application was also not well defined for some of the technologies with broad applicability outside NASA. Success for such technologies should be measured not only in terms of their deployment within NASA but also in terms of their broad deployment and use outside NASA. If broad deployment outside NASA does not take place, then future NASA missions will be burdened with providing continued support for NASA-unique technology, thereby missing the opportunity to leverage a broader external base of support.

Another way to think of the problem is that NASA must choose carefully between developing the best technology for NASA and developing technology that is good enough for NASA but will have broader applicability and will not require a continuous investment stream from NASA. In such cases, success outside NASA that drives standards and pushes commercialization should be the main goal.
This need to leverage outside investment seems to be recognized by most tasks within grid computing that are contributing to a broad community effort. The recognition of this need is less apparent in the tasks in high-performance computing (also known as advanced computing), which seem to pursue many technologies that are similar to or perhaps the same as technologies being pursued outside NASA.

The high-performance computing area is an excellent case study in the type of problem NASA faces when maturing a technology. The work in this area is currently embodied by two tasks at NASA Ames Research Center: (1) High-End Computing Architecture

Research and (2) Research in Programming Paradigms. A significant amount of effort has gone into developing a shared-memory programming model to support high-performance computing. NASA has an opportunity to take the lead on such development, but at this time it does not have the critical mass to successfully engage the broader community. It is essential that NASA make clear decisions on how to proceed with the transition.

The CICT program has demonstrated that there is value in the approach it has taken with shared-memory models from both a hardware perspective and a software programming tools perspective. However, NASA cannot afford, in the long run, to follow a NASA-unique approach in this area. In the near term, it should focus on a broader effort that encourages others outside NASA to adopt common shared-memory models and to develop a shared infrastructure of libraries and tools. There is strong interest in similar programming models in much of the high-performance computing community.

Some examples of actions that NASA can take include (1) normalization and extension of NASA benchmarks, (2) participation in standardization activities external to NASA, and (3) making a complete set of CICT's MLPlib library and associated programming support tools available to the broader community at no cost. Within 1 year of the onset of this activity, CICT should, at a minimum, be able to formulate an acceptable benchmark set and programming environment for the parallel libraries that NASA chooses to support. These benchmarks should be appropriate for use not only by NASA but also by the general hardware and aerospace technical community.

Finding: NASA has an opportunity to take the lead on shared-memory programming model development, but at this time it does not have the critical mass to successfully engage the broader community.
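To make the shared-memory programming style at issue concrete, the following is a minimal sketch (illustrative only; the array contents and worker count are arbitrary, and this is not NASA's MLPlib): several workers update disjoint slices of a single array that lives in one shared address space, with a join acting as the barrier.

```python
# Minimal sketch of a shared-memory programming model: workers update
# disjoint slices of one array in a single address space.
# Illustrative only; this is not NASA's MLPlib.
import threading

def fill_slice(shared, start, stop):
    # Each worker owns a disjoint slice, so no locking is required here.
    # Overlapping writes would need a lock or other synchronization.
    for i in range(start, stop):
        shared[i] = i * i

def parallel_squares(n, workers=4):
    shared = [0] * n                      # one array, visible to all workers
    step = (n + workers - 1) // workers   # ceiling division for slice size
    threads = [
        threading.Thread(target=fill_slice,
                         args=(shared, w * step, min(n, (w + 1) * step)))
        for w in range(workers)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()                          # barrier: wait for all workers
    return shared

print(parallel_squares(8))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The point of the model is that the data never move: every worker reads and writes the same memory, so the cost of the parallel step is the computation itself plus the final barrier, not communication.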
Reviewing and Selecting Proposals

CICT appears to have a good methodology for reviewing and selecting proposals, although at the start of this review it was not apparent to the review panel how labor is divided between internal and external reviews in the CICT program. There is also an inherent conflict of interest in having a NASA manager choose between keeping a task in-house (that is, having NASA employees perform the work on the task) and outsourcing it (where an external company performs the work), since that manager will be managing any in-house effort selected. Individual task owners have thought about their future plans in a reasonable manner. However, these future plans must be balanced with other suggested research, including that suggested by the CICT panel.

The CICT panel commends the use of external reviews and of a competitive process for proposal selection, as done by the IS project. Such a process leads to the selection of technically good proposals in defined areas. Some tasks, such as NSF Collaboration (under the ITSR project) and IF Infrastructure for Space Systems (under the SC project), seem to successfully take advantage of external reviews for assessing progress during task execution. CICT management should encourage this type of activity.

Based on the interim letter report issued by the PRT committee in January 2003 (NRC, 2003), CICT management decided that all tasks for the majority of CICT projects will be reviewed by peer committees, similar to the NASA NRA process. The panel believes that this is a step in the right direction and encourages CICT to keep an active peer review process in place for the entire program; however, the process by which reviews will take place has not been evaluated by the panel for effectiveness.
The drawback to this type of review process is that it may not lead to a good mix of low and high risk of project failure and of short-term and long-term tasks, and the process may also not provide a rational allocation of resources for entering into new technology areas. It may be useful for CICT management to explicitly manage the allocation of resources between tasks with low and high risk of project failure and between short-term and long-term tasks for each technology area.

There are three basic types of risk associated with tasks and elements: the risk of failure for a given task, the risk of a successful task not fitting into a larger system, and the risk associated with not starting a task or an element at all. In general, few tasks were rated as having a low likelihood of success in the written questionnaire responses. This may indicate a bias in the reporting, little investment in high-risk-of-failure tasks, or inadequate analysis of system-level risks. The CICT panel believes it is important to have a balance of risks, and it appeared that the CICT program could stand to pursue a greater proportion of tasks with a higher risk of failure. Risk of research failure can be managed using well-defined milestones as decision

points for the continuation, revision, or cancellation of tasks. Such a risk management process (e.g., systems-level analysis and customer knowledge) seems to be applied in certain projects, in particular the SC project, but not in others.

A possible solution is for CICT to have a clean division between categories of risk and then clearly define the research tasks and the criteria for assessing success of the tasks within these risk categories. External review panels, which CICT already plans to use, could then select proposals for tasks within these categories, similar to the National Science Foundation (NSF) model, so that program managers can still exercise some judgment in the process. External advisory panels that mix people having a long association with NASA with people who are domain experts and have no significant interaction with NASA may be a good mechanism for identifying and initiating new technology areas in which to invest. This task selection process could allow CICT management to use external review as a positive tool in its program, while maintaining an appropriate risk balance in the research portfolio and providing a mechanism by which CICT can branch into new technology areas.

Technology Readiness Level

The CICT panel found that, in general, the PIs did not assess the technology readiness level (TRL) of their tasks in a consistent manner. The panel's impression is that many tasks were ranked too low on the TRL scale by the PI. The following is a short, random handful of examples:

· Visualization (CNIS): This task was ranked as TRL 1 to 6, which is not very precise.
· Robust Intelligent Systems Based on Information Fusion (IS): This task is ranked TRL 1 by the PI. The work may well be fundamental and novel, but if it is successful, the path to actual deployment could be quite rapid.
· Evolutionary Algorithms for Scheduling (ITSR): This is an application of genetic algorithms to satellite scheduling and was ranked TRL 2 by the PI. The use of genetic algorithms for this type of scheduling problem is not new and appears to the CICT panel to be an application of a known technology to a new kind of problem.

Reorganization of Projects and Management Structure

The CICT review panel has only addressed programmatic issues as they arise from technical issues. The CICT panel has found that some projects within CICT, such as the SC project, seem to have a more coherent vision, better plans, and better project management than other projects in the CICT program (Bhasin, 2002a, 2002b). The panel feels strongly that SC's positive performance reflects, in part, the relative stability of the SC project compared with projects that experienced frequent reorganizations. The reader may remember that the SC project had the highest number of world-class tasks, as reported earlier in this chapter. Stability is important: it lets plans mature, allows management to track progress, and helps develop a coherent vision.

Finding: The CICT program appears to be suffering from too-frequent reorganization. There is a direct link between the stability of a project and the project's technical performance. It is important that tasks be given time to mature under consistent leadership.

TECHNICAL COMMUNITY CONNECTIONS

The CICT panel was charged with looking at how well the CICT program is linked to the technical community at large. These are some of the questions asked in the statement of task:

· Is there evidence that the research plan for the area under review reflects a broad understanding of the underlying science and technology and of comparable work within other NASA units as well as industry, academia, and other federal laboratories?
· Is there evidence that the research builds appropriately on work already done elsewhere?
· Does it leverage the work of leaders in the field? Are partnerships, if any, well chosen and managed?
· Is the research being accomplished with a proper mix of personnel from NASA, academia, industry, and other government agencies?

A large number of tasks within the CICT program seem to be quite small in size and effort, with only one or two FTEs per task (based on the questionnaire responses). This conclusion might be an artifact of the funding and reporting mechanisms used by NASA. It might also be indicative of too many efforts spread too thin over too many areas. If the latter is true, the panel encourages CICT management to have fewer but larger efforts. This strategy might improve the chances of having an impact, improve interactions and collaborations, and expand the involvement of management with external research projects. In particular, CICT should seek to establish more collaborations involving multiple research institutions, as well as collaborations with other researchers inside NASA. The planned use of external review panels to select proposals, as discussed previously, may well reduce the conflict-of-interest problems that such collaborations often face. An example of a successful collaboration is Microfabricated Force-Detection Spectroscopy, where the PI is apparently able to leverage a modest 0.5 FTE to work on a task that promises to yield significant results.

CICT management should strongly encourage task PIs to seek peer-reviewed publication in the proceedings of major conferences and workshops. This process provides an objective measure of research quality, gives NASA visibility in the research community, and provides useful peer feedback, especially for new, low-TRL areas such as bio-nanotechnologies, which seem to have relatively few publications considering the considerable effort being devoted to them. An example is the task Molecular Electronics under the ITSR project, which provides funds for three investigators, addresses a very-high-risk-of-failure research area, and still has no publications.
By way of contrast, the task Computational Nanotechnology-Chemistry, being carried out by six investigators, has contributed several publications. As stated earlier, the central role given to carbon nanotube and related materials research in the bio-nanotechnologies element should be carefully evaluated from time to time for progress toward NASA-mission-related applications.

In a related example of peer review, the CICT panel commends a small advisory board formed at JPL to guide the work being conducted under the task Biological Computing-BioInspired Information Processing and Exploration with Active Sensor Arrays. This research group has made significant improvements since its initial review by the panel (Tu and VanDalsem, 2003). The research group now needs to expand its expertise by involving researchers in computer science, materials, and engineering design on the advisory board, to work on the possible deployment of new technologies. The CICT panel believes that such a formal advisory board will help the group establish clear technical standards for whether or not a research activity will yield products of interest and use to NASA.

The panel also encourages CICT management to organize and fund workshops to enhance the program's interaction with the external world and expose outside researchers to NASA's problems. NASA did this extensively in the past, and the panel encourages the CICT program to continue to do so. For example, NASA hosted the Satellite Networks and Architectures Conference in Cleveland in June 1998. At the conference, NASA presented its work on TCP/IP communications over satellites. Commercial and university representatives gave talks on various aspects of data communications, with an emphasis on communication protocols. This conference is an example of how NASA can engage the external community by having NASA researchers interact with their peers in industry and academia.
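The protocol emphasis at that conference is driven largely by one piece of arithmetic: the bandwidth-delay product of a satellite path. The sketch below uses illustrative numbers (a rough geostationary round-trip time and an arbitrary 10 Mbit/s link rate, not figures from the conference) to show why standard TCP struggles on such links.

```python
# Bandwidth-delay product for TCP over a geostationary satellite path.
# Illustrative numbers: a GEO round trip is roughly half a second, and
# the 10 Mbit/s link rate is an arbitrary example value.
def bdp_bytes(link_bits_per_s, rtt_s):
    """Bytes that must be in flight to keep the link fully utilized."""
    return link_bits_per_s * rtt_s / 8

GEO_RTT_S = 0.5           # approximate geostationary round-trip time
LINK_RATE = 10_000_000    # 10 Mbit/s, illustrative

window = bdp_bytes(LINK_RATE, GEO_RTT_S)
print(int(window))        # 625000 bytes must be in flight

# Classic TCP advertises at most a 65535-byte receive window, so a sender
# cannot fill this link without the window scaling option (RFC 1323).
print(window > 65535)     # True
```

The needed window is roughly ten times the classic TCP limit, which is why satellite work of this kind concentrates on protocol extensions rather than raw link hardware alone.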
The exemplary work conducted under the SC project within CICT testifies to the value of these types of exercises. The panel notes that the CICT program has taken steps in the right direction. For example, the chair of the IEEE Nanotechnology meeting held in August 2003 was a NASA Ames researcher working under the nanotechnology portion of the program.

Finally, export controls are a reality with which NASA researchers must contend. Researchers should not use export controls as an excuse for the absence of publications, software distribution, and peer review. Researchers and management should anticipate such constraints and plan so that they do not impede the publication of early CICT work or the distribution of software.

Recommendation: To expose the external technical community to NASA-specific issues and provide maximum leverage for CICT-funded tasks, CICT management should strongly encourage task PIs to seek peer-reviewed publication in journals and in the proceedings of major conferences and workshops. CICT management should also organize and run technical workshops.

Recommendation: PIs and CICT management should anticipate and plan so that export controls or other restrictions do not impede the publication of early CICT work.

Awareness of Relevant Research

The scientific work done in various tasks was generally sound and frequently of good quality. Most PIs appeared to be aware of relevant research work inside and outside NASA. CNIS researchers, in particular, had a good awareness of relevant research.

Awareness of Tasks Within NASA

The CICT panel found that some areas within the CICT program could have better communication and collaboration between various tasks and between researchers and developers in other parts of NASA. Some tasks overlap, and the overlap should be recognized and managed by the CICT program. The PIs working on such tasks did not always seem aware of the overlap. For example, the tasks MER Rover Sequence Generation and MER Collaborative Information Portal, both under the Intelligent Systems (IS) project and both targeted at the MER mission to Mars, should be better coordinated than appears to be the case. Also, the tasks in the following lists seem to be closely related and appear to comprise different approaches to solving the same problem. If this is the case, there is nothing wrong with it, but the PIs for the tasks should be in close contact with each other and should be managed and coordinated by upper management. The reader should not infer that the following lists include all such overlaps across the CICT program. Examples of possible overlap from the IS project are these:

· Artificial Collective Intelligence (Automated Reasoning element) and Adapting Coordination and Cooperation Strategies in Teams (Human-Centered Computing element).
· Robust Speech Recognition Using Dynamic Synapse Neural Networks (Human-Centered Computing element) and Advanced Spoken Dialogue Interface Systems (Human-Centered Computing element).
There may be overlap among all of the following tasks within CNIS:

· Grid Science Portals (Grid Common Services element).
· Storage Research Broker Development and Support, and Grid Testbed Support (Grid Common Services element).
· Development (Grid Common Services element).
· Visualization (Information Environments element).
· Grid User: Project Portal Development Environment (Information Environments element).

One specific example not listed above, the Data Fusion IS-IDU-SHT task under IS and the Intelligent Data Understanding element, should be closely coordinated with similar work being conducted at JPL and the University of Minnesota in the task Discovery of Changes from the Global Carbon Cycle and Climate System Using Data Mining.

It should be noted that the CICT panel has deemed that these tasks centered on grid computing are worthy efforts and appropriate for NASA to pursue. However, the panel did not get a good understanding of how the various organizations within NASA will collaborate. There are many possible reasons why the top-level vision is not clear and why the overlap discussed above is taking place. One might be that a lack of communication between PIs keeps PIs from knowing the big picture. Regardless of the reason, it should be up to CICT management to identify causes and address issues.

Awareness of Tasks Outside NASA

A separate issue is the overlap of CICT efforts with work done outside NASA. In some of the tasks, the work is very specific to NASA missions, so little work being conducted outside NASA will apply to those missions. In almost all other tasks within the CICT program, it appears that PIs are sufficiently familiar with current research being conducted in their fields outside NASA. Better communication could improve the work on various tasks in the antenna lab at NASA Glenn Research Center.
In some cases, researchers seemed unaware of key relevant research performed elsewhere, as evidenced in the write-ups provided by each PI. Some tasks from the IS project that could benefit from a survey of the relevant technical literature are these:

· Machine Learning and Data Mining for Improved Intelligent Data Understanding of High Dimensional Earth Sensed Data.
· Robust Intelligent Systems Based on Information Fusion.
· Machine Learning for Earth Science Modeling.
· Knowledge Discovery and Data Mining Based on Hierarchical Segmentation of Image Data.

Use of Talent Inside and Outside NASA

The CICT program does a good job of selecting external talent, from both academia and industry, to work or collaborate on tasks. As an example, the procedure followed by the IS project for soliciting and choosing among proposals for work at low TRLs (Hine, 2002) seems to be very effective. Many of the external PIs chosen by this project are clearly active and well-respected researchers in their technical fields. The IS program should also be commended for having allocated a larger portion of its resources to establishing external community connections over the past several years.

Many projects appear to involve a mix of internal NASA researchers and external researchers. It is essential to long-term success to retain internal expertise not only in technologies unique to NASA mission success (e.g., autonomous robots, planning and scheduling, application of software validation and verification, space communication hardware) but also in other critical technologies. This expertise is necessary to set the appropriate research agenda, to ensure the quality of results acquired from outside resources, and to integrate and assemble technology acquired externally into NASA systems. The panel believes that all CICT projects should be open to competition and should consider external researchers. Internal and external competition should be conducted separately and in a manner that encourages collaboration. There are also a number of research areas that are funded in part by NASA and in part by other agencies, such as DOD and NIH.
The mix of personnel on these projects seems, in general, to be appropriate. There are many instances within the CICT program where the program would benefit from expending some effort to maintain a direct connection between NASA and external researchers. In this way, CICT management can guide the external work so that, as much as possible, it is relevant to NASA missions. One way of doing this would be to make sure that NASA personnel familiar with mission goals and needs have a chance to work closely with external researchers to keep their research focused and relevant.

Internal NASA research uses both civil service staff and contractors to perform the work. The relationship between the two groups in CICT was seamless, a good situation. Overall technical and management leadership should remain with civil service staff to guarantee project accountability.

Recommendation: To maintain a strong research base, the CICT program should continue to encourage a close connection between its researchers and the external research community by, for example, encouraging its researchers to attend conferences and serve as journal editors.

Benchmark Datasets and Problem Sets

In addition to funding research projects, CICT management could leverage the work of researchers outside NASA by providing appropriate benchmark datasets or problem sets. In this way, the work being conducted outside NASA would be relevant to NASA-specific problems at little or no additional cost. The release of such datasets would facilitate quantitative comparison of different research techniques, as well as encourage the broader community of researchers who are not funded directly by NASA to consider NASA-relevant problems in their work.

The task NSF Collaboration, under the ITSR Automated Software Engineering Technologies element, is a good example of a task where such activity has taken place. Under this task, the PI jointly funded the creation of a reliable software testbed.
This was also discussed earlier under the section on the CICT research portfolio in relation to parallel programming tools.

FACILITIES, PERSONNEL, AND EQUIPMENT

During the site visits and during various interactions with researchers throughout the course of the review process, the CICT review panel found the qualifications of the CICT scientific staff to be very good and easily comparable to those of world-class researchers. As noted in the previous chapter, the external investigators that the CICT program has employed are also world-class and of high renown. The facilities and working environment are in a very good state of repair and on a par with other government laboratories and facilities. All researchers appeared to have the tools and equipment they needed, in very good working order. During its site visits, the panel did not observe unnecessary duplication or poor use of NASA- or contractor-furnished equipment or facilities.

In the case of JPL in Pasadena, California, some of the laboratory space was cramped for the number of researchers working there, but the CICT panel understands that laboratory space at JPL is very limited. In any case, the panel found that the cramped laboratory space did not impede the researchers' technical progress. In the view of the panel, NASA has done an excellent job.

REFERENCES

National Research Council (NRC). 2003. Interim Report of National Research Council Review of NASA's Pioneering Revolutionary Technology Program. Washington, D.C.: The National Academies Press. Available online at <http://www.nap.edu/catalog/10605.html>. Accessed April 29, 2003.

Venneri, Sam. 2001. NASA Aerospace Technology Enterprise, Strategic Master Plan, April. Washington, D.C.: National Aeronautics and Space Administration.

BRIEFINGS

David Alfano, NASA Ames Research Center, "Information Technology Strategic Research Overview," presented to the CICT panel on June 12, 2002.

Dennis Andrucyk, NASA Headquarters, "Office of Aerospace Technology FY2004 President's Budget," material provided to the committee on May 5, 2003.

Kul Bhasin, NASA Glenn Research Center, "Space Communications Project Overview," presentation to the CICT panel on June 12, 2002(a).

Kul Bhasin, "Space Communications Level IV Projects," presentation to the CICT panel on July 24, 2002(b).

Butler Hine, NASA Ames Research Center, "CICT Intelligent Systems," presentation to the CICT panel on June 11, 2002.
Eugene Tu, NASA Ames Research Center, "Computing, Information, and Communications Technology (CICT) Program Overview," presentation to the committee and panels on June 11, 2002.

Eugene Tu and Bill VanDalsem, NASA Ames Research Center, "CICT Actions in Response to the NRC Review of NASA's Pioneering Revolutionary Technology Program Interim Report, dated January 16, 2003," material presented to the committee on April 21, 2003.

Jerry Yan, NASA Ames Research Center, "Computing, Networking, and Information Systems Project," presentation to the CICT panel on June 12, 2002.
