CONTENTS

SUMMARY
4.1 INTRODUCTION
4.2 THEORY, COMPUTATION, AND SUCCESS STORIES
    Key Issues for the Next Decade
4.3 THE ROLE OF DATA EXPLORATION
    Accomplishments of the Past Decade
    Problems and Solutions
    Available Resources
4.4 A COHERENT FRAMEWORK FOR THEORY, MODELING, AND DATA EXPLORATION
    Coupling Complexity in Space Plasma Systems
    The Challenges of Coupling Complexity
PANEL ON THEORY, MODELING, AND DATA EXPLORATION

SUMMARY

Today, space and solar physics presents both great opportunities and challenges, stemming from a science that is changing character from strongly exploratory and discovery driven to more mature and explanation driven. Furthermore, the societal and economic impacts of solar and space physics, through the forecasting of space weather, have become increasingly important. These opportunities, challenges, and societal and economic impacts place significant new demands on theory, modeling, and data exploration. Theory and modeling act to interpret observations, making them meaningful within the context of basic physics. Frequently, theory reveals that seemingly disparate observed phenomena correspond to the same physical processes in a system (or, better yet, in many systems). Furthermore, besides their roles in organization and understanding, theory and modeling can predict otherwise unexpected but important or relevant phenomena that may subsequently be observed and that might otherwise not have been discovered.

This report of the Panel on Theory, Modeling, and Data Exploration provides a brief survey of key aspects of theory, modeling, and data exploration and offers four major recommendations. The survey is not exhaustive but instead emphasizes basic issues concerning the nature of theory and its integrative role in modeling and data exploration. The panel offers a new synthesis for the organization and integration of space physics theory, modeling, and data exploration: coupling complexity in space plasma systems. "Coupling complexity" refers to the class of problems or systems that consist of significantly different scales, regions, or particle populations and for which more than one set of defining equations or concepts is necessary to understand the system.
For example, the heliosphere contains cosmic rays, solar wind, neutral atoms, and pickup ions, each of which interacts with the others but needs its own set of equations and coupling terms. Similarly, the ionosphere-thermosphere and the magnetosphere are different regions governed by distinct physical processes.

From this synthesis, the four major recommendations flow naturally. They are designed to (1) dramatically improve and expand space physics theory and modeling by embracing the idea of coupling complexity (or, equivalently, nonlinearity and multiscale and multiprocess feedback) within space plasma systems; (2) increase access to diverse data sets and substantially augment the ability of individual investigators to explore space physics phenomenology by redesigning the archiving, acquisition, and analysis of space physics data sets; (3) strengthen the role of theory and data analysis in space-based and ground-based missions; and (4) support and strengthen the application of space physics research to economic, societal, and governmental needs, especially in areas where space weather and space climatology impact human activities and technological systems.

THE COUPLING COMPLEXITY RESEARCH INITIATIVE

For theoreticians, modelers, and data analysts, the great challenges of space physics often result from the closely intertwined and integrated coupling of different spatial regions, disparate scales, and multiple plasma and atomic constituents in the solar, interplanetary, geospace, and planetary environments.
To embrace the demands imposed by hierarchical coupling, or coupling complexity (nonlinearity and multiscale, multiprocess, and multiregional feedback in space physics), space physicists must address a number of challenges:

· Formulation of sophisticated models that incorporate disparate scales, processes, and regions, and the development of analytic theory;
· Computation;
· Incorporation of coupling complexity into computational models;
· Integration of theory, modeling, and space- and ground-based observations;
· Data exploration and assimilation; and
· Transition of scientific models to operational status in, for example, space weather activities.

Recommendation 1. NASA should take the lead in creating a new research program, the Coupling Complexity Research Initiative, to address multiprocess coupling, nonlinearity, and multiscale and multiregional feedback in space physics. The research program should be peer reviewed. It should do the following:

· Provide long-term, stable funding for a 5-year period.
· Provide grants large enough that critical-mass-sized groups of students, postdoctoral associates, and research scientists, gathered around university and institutional faculty, can be supported.
· Provide funding to support adequate computational resources and infrastructure for the successfully funded groups.
· Facilitate the development and delivery of community-based models.
· Use the grants to leverage faculty and permanent positions and funding from home institutions such as universities, laboratories, institutes, and industry.

This research program would emphasize the development of coupled global models and the synergistic investigation of well-chosen, distinct theoretical problems that underlie the basic physics inherent in the fully general, self-consistent space physics problem. For major advances to be made in understanding coupling complexity in space physics, sophisticated computational tools, fundamental theoretical analysis, and state-of-the-art data analysis must all be brought under a single umbrella program. Thus, computational space physicists, theoreticians working with pen and paper, and data analysts need to be part of a single research program addressing a major problem in space physics. The models and algorithms developed by these research groups will make a major contribution to future National Aeronautics and Space Administration (NASA), National Science Foundation (NSF), and National Oceanic and Atmospheric Administration (NOAA) activities, especially those that focus on remote sensing and multipoint measurements. The models and algorithms will (1) couple measurements made at different times and places, (2) integrate the effect of multiscale processes so that those processes can be related to line-of-sight (column-integrated) remote-sensing measurements, and (3) provide a framework within which large multispacecraft data sets can be organized.
A fundamental component of this recommendation is that the award of a grant will carry with it the expectation of a commitment from the home institution (university, laboratory, industry) to develop a stable, long-term program in space physics by creating permanent positions; this would provide an intellectual environment within which large research efforts can flourish and would allow for critical-mass efforts.

Since nearly 30 groups submitted proposals to the most recent NASA Sun-Earth Connections Theory Program, the panel used this number as an indication of how many large groups currently exist in the United States. Accordingly, it recommends that the Coupling Complexity Initiative support 10 groups, each with funding of between $500,000 and $1 million per year. This would require committing $7.5 million to $10 million per year in funding. The panel recommends the formation of a cross-agency commission, with NASA possibly taking the lead through its Living With a Star program, to examine the implementation of a cross-agency Coupling Complexity Initiative.

THE GUEST INVESTIGATOR INITIATIVE

Related to the five tasks listed in Recommendation 1, data and theory face challenges in two areas:

· Integrating theory, modeling, and space- and ground-based observations and
· Data exploration and assimilation.

To address these points in the context of solar and space physics modeling and data analysis, the panel offers a second recommendation:

Recommendation 2. The NASA Guest Investigator program should (1) be mandatory for all existing and new missions, (2) include both space- and ground-based missions, (3) be initiated some 3 to 5 years before launch, and (4) be peer reviewed and competed for annually, with grant durations of up to 3 years.
Funding, at a minimum equivalent to 10 percent of the instrument cost, should be assigned to the Guest Investigator program and should explicitly support scientists working on mission-related theory and data analysis. Further, the Guest Investigator program for each mission should have the same status as a mission instrument. Other agencies should also consider guest investigator initiatives within their programs.

The panel strongly supports and endorses the current NASA Guest Investigator program and would like to see it strengthened, with similar programs created in other agencies. The implementation of this recommendation would address the very real concerns expressed by many experimentalists that too few theorists play an active role in exploring, interpreting, refining, and extending the observations returned by expensive missions. The panel notes that in an era of "fast missions," an already active cadre of theorists and data explorers should be in place to take full advantage of a newly launched mission. Furthermore, a robust Guest Investigator initiative may also address the concern that NASA expects principal investigators for experiments to submit proposals with extensive science goals but does not provide sufficient funding to support the science. At least 10 percent of the instrument cost should be assigned to the Guest Investigator program and should be budgeted in the mission costs from the outset. The panel recommends that Guest Investigator programs begin a few years prior to launch.
A DISTRIBUTED VIRTUAL SPACE PHYSICS INFORMATION SYSTEM

This recommendation is intended to increase access to diverse data sets and substantially augment the ability of individual investigators to explore space physics phenomenology by redesigning the archiving, acquisition, and analysis of space physics data sets.

Recommendation 3. NASA should take the lead in convening a cross-agency consultative council that will assist in the creation of a cross-agency, distributed space physics information system (SPIS). The SPIS should link (but not duplicate) national and international data archives through a suite of simple protocols designed to encourage participation of all data repositories and investigator sites with minimal effort. The data environment should include both observations and model data sets and may include the codes used to generate the model output. The panel's definition of data sets includes simulation output and supporting documentation.

Among other tasks, the system should do the following:

1. Maintain a comprehensive online catalog of both distributed and centralized data sets.
2. Generate key parameter (coarse resolution) data and develop interactive Web-based tools to access and display these data sets.
3. Provide higher-resolution data; error estimates; supporting platform-independent portrayal and analysis software; and appropriate documentation from distributed principal investigator sites.
4. Permanently archive validated high-resolution data sets and supporting documentation at designated sites and restore relevant offline data sets to online status as needed.
5. Develop and provide information concerning standard software, format, timing, coordinate system, and naming conventions.
6. Maintain a software tree containing analysis and translation tools.
7. Foster ongoing dialogues among users, data providers, program managers, and archivists, both within and across agency boundaries.
8. Maintain portals to astrophysics, planetary physics, and foreign data systems.
9. Survey innovations in private business (e.g., data mining) and introduce new data technologies into space physics.
10. Regularly review evolving archival standards.
11. Support program managers by maintaining a reference library of current and past data management plans, reviewing proposed data management plans, and monitoring subsequent adherence.

The primary objective of this recommendation is to establish a research environment for the disparate data sets distributed throughout the space physics community. By providing a cross-agency framework that encompasses agency-designated archives and principal investigator (PI) sites, the proposed space physics information system will enable users to identify, locate, retrieve, and analyze both observations and the results of numerical simulations. The community-maintained SPIS will facilitate the introduction of innovative data mining techniques and methodology from private business into the scientific community and improve communication between users and data providers. It will assist agency managers and PIs by archiving project data management plans and documenting best practices.

The panel recommends that SPIS begin at a modest level and grow only with demonstrated need and success. Periodic competitions for all tasks within the data system will help impose cost constraints. The central node would consist of a full-time project scientist, project manager, project engineer, and administrative assistant. The four discipline nodes would employ half-time scientists, quarter-time administrators, and half-time programmers. The central and discipline nodes provide the framework to link the various data sets.
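Task 2 above, the generation of key parameter (coarse resolution) data, amounts in its simplest form to block-averaging high-resolution time series. A minimal sketch, with hypothetical sample values and cadences (not an SPIS specification):

```python
import numpy as np

# Sketch of "key parameter" generation: reduce high-resolution data to
# coarse resolution by block-averaging. Hypothetical example: one hour of
# 1-second magnetometer samples averaged into 60-second key parameters.
rng = np.random.default_rng(0)
hi_res = 5.0 + 0.5 * rng.standard_normal(3600)   # simulated 1 s samples (nT)

block = 60                                       # samples per key parameter
usable = len(hi_res) // block * block            # drop any partial block
key_params = hi_res[:usable].reshape(-1, block).mean(axis=1)

print(len(key_params))                           # 60 coarse values for the hour
```

Real key parameter pipelines would also propagate error estimates and data-quality flags alongside the averages, per task 3.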
With the help of advisory committees from the research and operational communities, the central and discipline nodes will identify and fund tasks at more ephemeral subnodes. A fully operational system might cost $10 million annually.

THE TRANSITION INITIATIVE

The applications of space physics research to economic, societal, and governmental needs, especially in areas where space weather and space climatology impact human activities and technological systems, need to be supported and strengthened. For models to be an effective bridge between space physics theory and economic, societal, and governmental needs, they have to address the demands imposed by coupling complexity and be sufficiently robust, validated, documented, standardized, and supported. These additional demands impose significant challenges on the groups that develop models, and the criteria for transitioning a model successfully to operational use are frequently very different
from those needed to develop models purely for research purposes.

Recommendation 4. NOAA and the Air Force should initiate a program to support external research groups in transitioning their models to NOAA and Air Force rapid prototyping centers (RPCs). Program support should include funding for documentation, software standardization, software support, adaptation of codes for operational use, and validation, and should allow the research group to assist in making the scientific codes operational. The RPC budgets of the NOAA Space Environment Center and the Air Force Space and Missiles Center/DSMP Technology Applications Division (SMC/CIT) should be augmented to facilitate the timely transition of models.

Despite the many solar, heliospheric, and geospace models that could potentially be used for operational space weather forecasting, relatively few have so far been transitioned into operation at the NOAA and Air Force space weather centers. This is due to inadequate resources to support transition efforts, particularly at the NOAA Space Environment Center. The recommended transition initiative addresses this problem directly.

Costs, particularly for the Air Force, are difficult to forecast precisely. For NOAA, the panel estimates that $1 million would be required to start and support the first year of operation, with $500,000 per year to support three permanent NOAA staff and an additional $500,000 for operational support, real-time data links, software standardization, documentation, and related expenditures. Since the panel anticipates that between three and five codes per year will be readied for transition at a cost of ~$200,000 each, an additional $1 million per year should be available for competition. A conservative annual budget is therefore between $2 million and $2.5 million.
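The components of the panel's NOAA estimate can be checked with simple arithmetic (a sketch; all figures are the panel's own estimates, not authoritative budget data):

```python
# Components of the recommended steady-state annual NOAA transition budget.
staff = 500_000            # three permanent NOAA staff
operations = 500_000       # real-time data links, standardization, documentation
competition = 1_000_000    # competed funds for transition-ready codes

base = staff + operations + competition
print(base)                # 2000000, the low end of the $2M-$2.5M range

# Three to five codes per year at ~$200,000 each fit within the competed funds.
low, high = 3 * 200_000, 5 * 200_000
print(low, high)           # 600000 1000000
```

The upper end of the quoted range presumably allows margin for operational contingencies, which the panel does not itemize.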
The transition initiative should be funded through a partnership between the Air Force and NOAA, with the precise levels of funding determined by the needs of each. The NOAA Space Environment Center and Air Force RPC budgets will have to be augmented, and greater industrial and business support will have to be developed.

4.1 INTRODUCTION

Space physics is positioned at a critical juncture. The past decade has witnessed a shift from a strongly exploratory and discovery-driven science to a more mature, explanatory science. Furthermore, the societal and economic impact of forecasting space weather has become increasingly important. With this maturing, solar and space physics offers exciting opportunities and places significant new demands on theory, modeling, and data exploration. Theory and modeling act to interpret observations, making them meaningful in the context of basic physics. Frequently, theory reveals that seemingly disparate observed phenomena correspond to the same physical processes in a system (or, better yet, in many systems). Furthermore, besides their roles in organization and understanding, theory and modeling can predict otherwise unexpected yet important or relevant phenomena that may subsequently be observed and that might otherwise not have been discovered. The dozen boxes throughout the report illustrate the complementary roles of theory and observation in space physics.

In this report, the panel briefly surveys key aspects of theory, modeling, and data exploration and offers four major recommendations. The recommendations are based on the deliberations of panel members, all of whom have been actively involved in the application of theory, modeling, and/or data assimilation to contemporary problems in solar and space physics, and on numerous interactions with the community of researchers and agency officials concerned with the issues outlined in this report.
At its meetings, the panel received presentations from a wide range of space physicists and agency officials. In reaching its findings, the panel drew on these experiences and on an extensive public dialogue with the broad space physics community. For example, the panel held town-hall-style meetings at various conferences and met with participants at the fall 2001 American Geophysical Union meeting, the Advanced Composition Explorer (ACE) workshop, and the NSF-sponsored meetings of the Geospace Environment Modeling (GEM) program and the Coupling, Energetics, and Dynamics of Atmospheric Regions (CEDAR) program. The panel also solicited comments via advertisements in electronic publications such as SPA News.

This report is not intended to be exhaustive; instead, it emphasizes basic issues surrounding the nature of theory and its integrative role in modeling and data exploration. Since examples illustrate themes far better than dry narrative, the panel offers representative space physics success stories (Boxes 4.1 to 4.10). The stories illustrate the importance of theory in synthesizing observations and describe its role in driving, and being driven by, data exploration processes that can lead to the identification of new scientific frontiers.
From this survey, the panel offers a new synthesis for the organization and integration of space physics theory, modeling, and data exploration: coupling complexity in space plasma systems. Put another way, the panel explores the science of nonlinearity and of multiscale and multiprocess feedback in space plasma systems. A precise definition of coupling complexity is given in Section 4.4, "A Coherent Framework for Theory, Modeling, and Data Exploration." From this synthesis flows a set of four major recommendations designed to achieve four goals: (1) dramatically improve and expand space physics theory and modeling by embracing the idea of coupling complexity (or, equivalently, nonlinearity and multiscale and multiprocess feedback) within space plasma systems; (2) increase access to diverse data sets and substantially augment the ability of individual investigators to explore space physics phenomenology by redesigning the archiving, acquisition, and analysis of space physics data sets; (3) strengthen the role of theory and data analysis in space-based and ground-based missions; and (4) support and strengthen the application of space physics research to economic, societal, and governmental needs, especially where space weather and space climatology impact human activities and technological systems.

4.2 THEORY, COMPUTATION, AND SUCCESS STORIES

Theory serves to identify fundamental problems and processes that cross traditional boundaries or divisions. It acts as a framework within which to interpret observations; it can integrate apparently disparate observations and data sets; and it provides guidance in the development of the tools, data sets, and observations needed to obtain a complete understanding of heliospheric plasma physics processes.
As space physics matures from an exploratory science, theory is beginning to assume an increasingly important role in defining the frontiers of the science, introducing new concepts that provide the framework for observational missions, observatories, and discoveries. Finally, space physics in general, as interpreted and guided by theory, occupies a unique place within the broader field of astrophysics in that it offers the only accessible laboratory in which to develop theories, build models, and test plasma physics processes by observation on all scales.

All these ideas are exemplified by the success stories presented in boxes throughout the text. Some of the stories highlight the role that theory has taken in developing new fields, and some show how theory and modeling can have an impact on society and the economy. Others address, resolve, and develop new questions in basic theory, and yet others illustrate how questions that arise in space physics frequently translate into basic questions for physics or astrophysics.

Numerical modeling has made dramatic advances over the past decade and is now considered by some space physicists to be a fourth branch of scientific methodology, alongside theory, experiment, and observation. Several factors have contributed to this advance:

1. Moore's law continues to hold: Computational power has increased and continues to increase exponentially, doubling approximately every 18 months.
2. New, more accurate, and more efficient numerical algorithms have been developed. Examples include flux-limited fluid and magnetohydrodynamics (MHD) algorithms and implicit particle codes.
3. Many models are now three-dimensional (spatial), an essential capability for addressing many kinds of problems.
4. Numerical models have become more realistic by including additional physical processes and by using more realistic model parameters.
5. Models with different algorithmic strengths, or models of interaction regions, are being coupled. Examples are codes addressing the coupling of the magnetosphere-ionosphere and ionosphere-thermosphere.
6. Numerical models are directly compared with observations, and their predictive capabilities are tested.

Computer modeling can be divided roughly into three classes, characterized as follows:

1. Modeling of a limited or local system or process with as much realism and/or mathematical accuracy as possible. The goal is to isolate a process, study it in detail, and extract the underlying physics. Sometimes the problem may be well posed mathematically, in which case the computer modeling approach tries to use the best numerical algorithms possible; but other times the problem may not admit a simple or tractable mathematical formulation, and the computer modeling may take the form of simulating the underlying physical processes in the hope that the collective behavior of the system is captured. Examples might be kinetic simulations of a reconnection-diffusion region, shock interaction modeling using high-order shock-capturing schemes, or turbulence modeling based on highly resolved spectral codes.
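As a toy illustration of class 1 modeling (not drawn from the report), a first-order finite-volume scheme with a Rusanov flux applied to the inviscid Burgers equation, the simplest nonlinear conservation law, shows how shock-capturing schemes remain stable as smooth initial data steepen into a shock:

```python
import numpy as np

# Class 1 sketch: shock-capturing solution of u_t + (u^2/2)_x = 0 on a
# periodic grid, using a first-order finite-volume (Rusanov flux) scheme.
nx, cfl, t_end = 200, 0.4, 3.0      # t_end long enough for a shock to form
x = np.linspace(0.0, 2 * np.pi, nx, endpoint=False)
dx = x[1] - x[0]
u = 1.0 + 0.5 * np.sin(x)           # smooth initial data that steepen in time

def rusanov_flux(ul, ur):
    # Numerical flux for f(u) = u^2/2 with local wave-speed dissipation.
    a = np.maximum(np.abs(ul), np.abs(ur))
    return 0.5 * (0.5 * ul**2 + 0.5 * ur**2) - 0.5 * a * (ur - ul)

t = 0.0
while t < t_end:
    dt = min(cfl * dx / np.max(np.abs(u)), t_end - t)
    f = rusanov_flux(u, np.roll(u, -1))     # flux at each right cell face
    u = u - dt / dx * (f - np.roll(f, 1))   # conservative update
    t += dt

print(u.min(), u.max())   # solution stays within the initial range [0.5, 1.5]
```

The conservative update preserves the mean of u exactly (up to roundoff), a property real shock-capturing codes rely on to get shock speeds right.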
2. Modeling that simulates large, possibly spatially or temporally complicated systems that can be described at a satisfactory level by a relatively simple, well-formulated set of mathematical equations (e.g., the MHD equations). The goal here is to solve the system in its full complexity (three dimensions, time dependence, realistic boundary conditions and initial data). This class of modeling requires the development of stable, robust, accurate, and efficient numerical algorithms to allow understanding of the physics of the spatially and temporally complex system as a whole, based on a relatively complete mathematical description. An example would be the three-dimensional, time-dependent MHD modeling of the magnetosphere.

3. Modeling that attempts to synthesize classes 1 and 2 by addressing complex coupled systems. The physical systems that are of interest frequently do not admit a single global mathematical model. Instead, quite distinct physical processes, although coupled at some level, may describe different regions or physical constituents. This class of computation attempts to bridge interactions between different regions, different processes, and different scales using different numerical techniques and/or mathematical formulations. The goal is comprehensiveness and the development of an understanding of the complexity of the system. The modeling may include the coupling of different numerical techniques and codes, each of which addresses a different plasma process and/or region. Examples include global magnetosphere-ionosphere-thermosphere models, solar wind-CME models, and models of the interaction of the solar wind with the local interstellar medium. Currently, because of limitations in computing capabilities and resources, very complicated boundary or initial conditions and the long-time evolution of a global system cannot be included in this third class of modeling.
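For reference, the "relatively simple, well-formulated" equation set invoked for class 2 modeling, the ideal MHD equations, takes the standard textbook form (supplied here for context, not reproduced from the report):

```latex
\begin{aligned}
&\frac{\partial \rho}{\partial t} + \nabla\cdot(\rho\mathbf{v}) = 0
  && \text{(mass continuity)} \\
&\rho\left(\frac{\partial \mathbf{v}}{\partial t}
    + \mathbf{v}\cdot\nabla\mathbf{v}\right)
  = -\nabla p + \frac{1}{\mu_0}(\nabla\times\mathbf{B})\times\mathbf{B}
  && \text{(momentum)} \\
&\frac{\partial \mathbf{B}}{\partial t} = \nabla\times(\mathbf{v}\times\mathbf{B}),
  \qquad \nabla\cdot\mathbf{B} = 0
  && \text{(induction; solenoidal constraint)} \\
&\frac{\partial p}{\partial t} + \mathbf{v}\cdot\nabla p
  = -\gamma p\,\nabla\cdot\mathbf{v}
  && \text{(adiabatic energy equation)}
\end{aligned}
```

Class 2 codes solve this coupled nonlinear system in three dimensions with realistic boundary and initial conditions, which is what makes stable, efficient algorithms essential.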
Numerical modeling of solar and space plasmas, along with theory, drives the synthesis of observations (and aids active experiments in space physics, albeit to a lesser extent, owing to the rarity of such experiments). Computer modeling aids in the interpretation of data, in particular where data coverage is sparse and where space-time ambiguities exist. Furthermore, numerical modeling has become a valuable tool for mission planning. Examples include the development of orbital strategies for magnetospheric missions and of plasma measurement requirements for Magnetospheric Multiscale. Finally, the panel observes that class 3 modeling forms the centerpiece of operational space weather forecasting. Currently, only a few models are used, but if meteorology is any indication, the number of models and their scope will increase dramatically in the next decade.

KEY ISSUES FOR THE NEXT DECADE

Continued advances in modeling will depend on the continued increase of available computational power. Moore's law is expected to hold over the next 10 years. Thus, CPU power will increase roughly 100 times (to 50 Gflops for the desktop and 50 Tflops for large systems), as will memory and disk storage. However, that power will have to be available to researchers in the field. This will require the availability of dedicated large-scale computers for grand challenge problems, as well as Beowulf-class computers for individual researchers and groups. Computer hardware should be treated like hardware for experimentation, and sufficient funding should be made available. NSF and NASA computing centers are insufficient. The two NSF centers, the San Diego Supercomputer Center and the National Center for Supercomputing Applications, serve most of the supercomputing needs of the nation's academic research communities and are notoriously oversubscribed, typically by a factor of between 2 and 5.
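The factor-of-100 projection follows directly from the 18-month doubling time; a sketch of the arithmetic:

```python
# Projected growth in computational power under Moore's law:
# capability doubles every 18 months, compounded over a 10-year decade.
doubling_time_months = 18
horizon_months = 10 * 12

growth_factor = 2 ** (horizon_months / doubling_time_months)
print(round(growth_factor))   # 102, i.e., roughly the 100-fold increase cited
```

Applied to the report's figures, this implies baselines of roughly 0.5 Gflops for desktops and 0.5 Tflops for large systems at the time of writing (an inference from the quoted endpoints, not a number stated in the text).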
Consequently, the turnaround time for codes at these centers is very long, and scientific progress is severely impeded. While the NSF centers will likely remain the resource of choice for the most demanding computations, smaller installations that are available exclusively to a research group are significantly more cost-effective and can provide for most of the more typical computing needs.

Dedicated funding lines for model development will also be needed. Current NASA/High Performance Computing and Communications (HPCC) and NSF/Information Technology Research (ITR) programs focus too closely on computational issues at the expense of basic science questions. The panel sees a danger of these programs becoming excessively computer science oriented. Certainly, issues such as version control, validation, and interoperability are important, especially in the context of the increasingly sophisticated codes that will be used by multiple users or groups, but it is essential that the focus remain on basic science and the computational tools needed to advance this science.

The development of large scientific codes, rather like the development of experimental hardware, can require a lengthy gestation period. Most NASA, NSF, and DOE programs generally support the science that is done with large computational models but do not typically support the development of such codes. Increased funding for
model development, from group-size to principal investigator (PI)-size grants, is essential.

There are currently very few dedicated funding lines for model development. For example, the Multidisciplinary University Research Initiative (MURI) program, which funds many projects, is not likely to be extended. Because funding for model development is scarce, most requests are hidden in proposals aimed at a particular scientific objective. For example, NASA SR&T and Sun-Earth Connections Theory Program (SECTP) grants and NSF GEM and Space Weather grants are usually awarded for the investigation of specific scientific issues, yet often some fraction of these grants is used for model development. However, these relatively small grants lead to fragmented model development efforts when much more focused development efforts are needed. This concealment of model development costs makes it virtually impossible to assess how much funding is actually available for model development. However, since the primary objectives of such grants are answers to science questions, the fraction of the funding devoted to model development is small, probably at most 10 or 20 percent. Thus, there is a clear need to increase funding for model development, from group-size to PI-size grants.

The modeling community will also have to become more open-source oriented. Codes should often be made available to the scientific community, but this requires clean coding, portability, and documentation, aspects that are not generally addressed in scientific codes. Furthermore, there is little incentive for code developers to address these issues, because the tasks are work-intensive and little or no funding is provided to document and disseminate codes where appropriate.
Data assimilation techniques, long considered essential in atmospheric and oceanic modeling, are only beginning to appear in solar and space plasma modeling. Naturally, data assimilation is appropriate for class 2 computer models. Assimilative models will be essential to ingest and analyze data from forthcoming multisatellite missions. Dedicated funding for data assimilation model development is required. As the panel describes below, such funding can be tied to the appropriate missions by funding, for example, model development during the B, C, and D mission phases. Class 3 computations need to continue to couple different physical models to properly address the complexity of the topical space plasma systems. Examples are solar dynamo-photosphere-chromosphere-corona-solar wind models and geospace models that include the ring current, radiation belts, and the plasmasphere. Class 2 simulations have to be made more realistic by ensuring that parameters are as true to actual conditions as possible. For example, kinetic simulations have to achieve realistic ion-electron mass ratios. In the long run there will be a confluence of class 2 and class 3 models. By way of example, fully three-dimensional kinetic hybrid models embedded in three-dimensional MHD models will be used to investigate Earth's magnetosphere within the next one or two decades.

4.3 THE ROLE OF DATA EXPLORATION

Projects supported by many U.S. government agencies have returned an unparalleled wealth of in situ and remote solar and space physics observations. Taken together, these data sets form the basis for innumerable scientific studies designed to characterize Earth's upper atmosphere and plasmas throughout the solar system. Real-time and archived observations are in constant use to identify new phenomena and test the predictions of theory and simulation.
Researchers use observations at ever-increasing temporal and spatial resolutions to identify the fundamental microscale physical processes that govern entire systems, and multipoint and multi-instrument observations to determine the relationships linking solar phenomena to processes throughout the heliosphere, magnetosphere, and ionosphere. Systematic analyses of archived data sets characterize variations in Earth's natural environment over periods both long and short (compared with the 11-year solar cycle) and identify the effects of manmade perturbations, including nuclear tests, ionospheric heating by ground radars, and plasma releases into Earth's atmosphere and magnetosphere. Solar and space physics observations also represent a national resource of immediate practical value to the United States. Both NOAA and the DOD currently use real-time observations of the Sun, the solar wind, Earth's magnetosphere, and Earth's ionosphere to predict the disruptive effects of space weather on ionospheric communication, aircraft navigation, radar defense, geosynchronous satellite safety, and commercial electrical power supply. DOE monitors the geosynchronous radiation environment to test adherence to nuclear test ban treaties. Data analysis has a number of other benefits. First, it defines the technology required for future instruments
and spacecraft. Second, solar and space physics data sets are eminently suitable for training students at both undergraduate and graduate levels in scientific methodology. Third, certain data sets are intrinsically fascinating to the general public, in particular images of the Sun and the aurora. Finally, solar and space physics data sets are becoming increasingly large. The need for rapid exchange of similarly large data sets prompted the development of the Internet and the World Wide Web, and there is every reason to believe that this need will continue in the future.

ACCOMPLISHMENTS OF THE PAST DECADE

Progress in solar and space physics often requires the correlative analysis of multiple data sets. The past decade saw exciting improvements in our ability to access data sets from different instruments, ground stations, and spacecraft. With the advent of the Internet, each federal agency established or designated an online archive (see Table 4.1) from which users could download data sets, view prepared survey plots, and interactively generate plots according to their own specifications. Cost constraints and a need for simplicity require that the designated archives limit the data sets and the options for their presentation. However, cutting-edge research often requires innovative analysis of higher-resolution observations using special software. In general, this software is best developed by members of the principal investigator's research team, who are most familiar with both the instrument characteristics and user needs.
TABLE 4.1 Designated Data Repositories

Agency   Data Center   Web Address
NASA     NSSDC         nssdc.gsfc.nasa.gov
NSF      NCAR          cedarweb.hao.ucar.edu (for upper atmospheric data)
NSF      UMD           www.polar.umd.edu (for Antarctic data)
NSF      NSO           www.nso.noao.edu/diglib/ (for solar data)
NOAA     NGDC          www.ngdc.noaa.gov
DOD      NGDC          www.ngdc.noaa.gov
DOE      LANL          leadbelly.lanl.gov
USGS     NGDC          www.ngdc.noaa.gov

NOTE: NSSDC, National Space Science Data Center; NCAR, National Center for Atmospheric Research; UMD, University of Maryland; NSO, National Solar Observatory; NGDC, National Geophysical Data Center; LANL, Los Alamos National Laboratory.

THE SUN TO THE EARTH AND BEYOND: PANEL REPORTS

A number of Web sites appeared during the past decade that allow remote researchers to interactively plot and even analyze high-resolution observations from individual experiments. Several programs also took up the challenge of providing real-time observations in a manner suitable for both scientists and the general public, including the Solar and Heliospheric Observatory (SOHO), the Transition Region and Coronal Explorer (TRACE), and the Reuven Ramaty High-Energy Solar Spectroscopic Imager (RHESSI), which provide solar images; Polar, which gives auroral images; ACE, which observes solar wind plasma and magnetic fields; the Geostationary Operational Environmental Satellites (GOES), which observe geosynchronous magnetic fields and energetic particles; and the Super Dual Auroral Radar Network (SuperDARN), which makes ionospheric radar observations.

PROBLEMS AND SOLUTIONS

Almost a decade ago, NASA and the research community convened an open workshop at Rice University to discuss problems hindering the widespread availability of solar and space physics data sets and to identify possible solutions.1 Despite the progress noted above, many of the same problems confront researchers today. Before data analysis can begin, researchers must first locate the data sets they wish to analyze.
During the 1990s, efforts were made to develop a community-maintained catalog of online data sets. However, these efforts lapsed in the absence of any agency sponsorship. Consequently, the only available catalogs are those maintained by individual archives for their own data set holdings. Even when users know that the data sets they require exist, there are few pointers to where they might be obtained. The various federal agencies should work together to maintain one or more comprehensive catalogs pointing to all relevant data sets and documentation. Many valuable data sets remain offline. Because multiple data requests impose a considerable burden on researchers at the locations where the data sets are stored, these data sets can be virtually inaccessible.

1NASA, Office of Space Science and Applications, Space Physics Division. 1993. Concept Document on NASA's Space Physics Data System. NASA, Washington, D.C. See also <http://spds.gsfc.nasa.gov/wkshprpt.html>.
When held solely within the community, offline data sets are in danger of being lost forever.2 As the costs of storage space and high-speed communication continue to decrease, it is in the interest of the principal investigators themselves to place the data sets online in response to user requests and to the need for data centers to archive the data sets permanently. Program officials can speed this process by requiring proper data management plans in new proposals and by monitoring the availability of data sets when selecting new projects and considering their continued funding or renewal. Even more importantly, program officials should reward public service with enhanced funding. Even when data sets are publicly available, they may be unusable because of their complexity. A number of recent positive developments indicate ways in which this problem can be solved. Several research teams have begun making their own software available over the Internet. Particularly praiseworthy are efforts to enable outside users to access and portray complex high-resolution observations over the Web by running software remotely at the provider's institute. Efforts to develop platform-independent analysis and graphics tools for researchers inside and outside the principal investigator's team deserve high priority from program managers. Since it can be costly to maintain software solely for external users, the best solutions provide the software used by the investigator team to external users. The data sets needed for correlative studies are frequently provided from multiple sources in different formats. Interactive tools that can draw upon these sources to portray multiple data sets, and software that can translate differing formats, are urgently needed.
While it is heartening to note that efforts to remedy these problems are under way at a number of institutions, it is also clear that there is considerable duplication of effort. To minimize costs, the research community needs to establish a software library in conjunction with archival centers. In particular, it would be desirable to initiate supervised online software trees like the solar community's SolarSoft to encourage software sharing and compatibility. Finally, the rapid advances in numerical simulations described above imply that we will soon be making real-time space weather forecasts. Before becoming operational, the forecast models must be validated by comparing their predictions with in situ and remote observations. Any future data system must incorporate and support the results of the numerical simulations in a format and manner that facilitate this comparison.

AVAILABLE RESOURCES

At least two programs currently provide resources to improve the solar and space physics data environment: NSF's ITR program and NASA's Applied Information Systems Research Program (AISRP). The former supports research in systems design, information management, applications in science, and scalability. As this program is heavily oversubscribed, it has not had a significant impact on theory, modeling, and data exploration. By contrast, NASA's AISRP specifically targets the space sciences and is responsible for many of the success stories discussed in Boxes 4.1 to 4.10. Its funds have been used to restore valuable data sets and place them online and to develop cataloging, depiction, and format exchange tools. Particularly praiseworthy is the fact that the funds have often been used for projects that cross agency boundaries.

2NRC. 2002. Assessment of the Usefulness and Availability of NASA's Earth and Space Science Mission Data. National Academy Press, Washington, D.C.
The recently inaugurated Virtual Solar Observatory (VSO)3 illustrates well how the diverse data sets held within the solar and space physics discipline can be integrated and studied in a coherent manner. This initiative responds to a recommendation from the National Research Council's Committee on Ground-based Solar Research,4 which called for NSF and NASA to collaborate in developing a distributed data archive with access through the Web by establishing a scalable environment for searching, integrating, and analyzing databases distributed over the Internet. Given the increasing emphasis on correlative space weather studies that seek to determine the terrestrial effects of solar disturbances, now is an appropriate time to expand this effort by establishing a data system that encompasses the full range of solar and space physics phenomena. In Section 4.4 the panel describes plans for a virtual, distributed space physics information system.

3See <http://www.nso.noao.edu/vso/>.
4See Recommendation 7 on page 46 in NRC, Space Studies Board, 1998, Ground-based Solar Research: An Assessment and Strategy for the Future, National Academy Press, Washington, D.C.
Over the past decade great progress has been made in modeling geospace, that is, the global magnetosphere-ionosphere-thermosphere system and its interaction with the solar wind (SW) and the interplanetary magnetic field (IMF). While the construction of cartoons depicting the global magnetosphere-ionosphere-thermosphere system was essentially state of the art a decade ago, today's most sophisticated models are run with measured SW/IMF data as input and compared extensively with in situ observations. Figure 4.1.1 shows results from the University of California at Los Angeles/NOAA geospace model for the Bastille Day geomagnetic storm in 2000 (July 14/15). The color rendering in the upper part of the figure shows the magnetosphere during the storm's main phase as it is compressed by the high solar wind dynamic pressure and eroded by the strong southward IMF Bz that is the hallmark of this storm. All three operational geosynchronous GOES satellites (marked by small red spheres in the figure) crossed into the magnetosheath at this time, as predicted correctly by the model. The lower part of the figure shows a comparison between the predicted (red) and observed (black) auroral upper (AU) and auroral lower (AL) indices of geomagnetic ground disturbance. The model is able to predict some of the intensifications and their magnitudes; however, it misses others and clearly needs improvement. Global geospace models are now quantitative, and the new challenges are data assimilation, improving the physical realism, and metrics evaluations.

[Figure 4.1.1: simulated magnetosphere during the storm's main phase (top) and predicted versus observed AU and AL indices plotted against time in hours (bottom).] FIGURE 4.1.1 SOURCE: J. Raeder, Y.L. Wang, T.J. Fuller-Rowell, and H.J. Singer. 2001. Global simulation of magnetospheric space weather effects of the Bastille Day storm. Solar Physics 204: 325-338.
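The "metrics evaluations" mentioned in Box 4.1 usually reduce to quantitative comparison of a predicted time series against the observed one. The sketch below illustrates one common skill measure, the prediction efficiency (1 minus the ratio of mean-square error to observed variance); the function name, the synthetic AL-like series, and the noise level are all invented for illustration and are not taken from the Raeder et al. model.

```python
import math
import random

def prediction_efficiency(observed, predicted):
    """Skill score: 1 - MSE/Var(observed). A value of 1 is a perfect
    forecast; <= 0 means no better than always predicting the mean."""
    n = len(observed)
    mean_obs = sum(observed) / n
    mse = sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n
    var_obs = sum((o - mean_obs) ** 2 for o in observed) / n
    return 1.0 - mse / var_obs

# Synthetic stand-in for hourly AL-index values (nT) -- illustrative only.
random.seed(0)
observed_al = [-200.0 + 150.0 * math.sin(0.2 * t) for t in range(48)]
predicted_al = [o + random.gauss(0.0, 40.0) for o in observed_al]  # imperfect model

pe = prediction_efficiency(observed_al, predicted_al)
rmse = math.sqrt(sum((o - p) ** 2
                     for o, p in zip(observed_al, predicted_al)) / 48)
```

A prediction efficiency near 1 indicates that the model reproduces most of the observed variance; values near or below 0 mean it performs no better than a constant forecast of the observed mean.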
The physics of the outer heliosphere and its interaction with the partially ionized local interstellar medium (LISM) has become an exciting growth area in space physics. An important recent development in outer-heliospheric research is the prediction and discovery of the "hydrogen wall." Theoretical models predict that the partially ionized LISM and the solar wind are separated by a complex set of plasma and neutral-atom boundaries of enormous scale, located between ~90 and ~250 AU from the Sun. A specific theoretical prediction by the models is that a wall of interstellar neutral hydrogen should exist upstream owing to the relative motion of the heliosphere and the LISM. This hydrogen wall is predicted to have a number density slightly more than twice the interstellar density, to be hotter than interstellar hydrogen, and to be some 100 AU wide. The physical reason for the hydrogen wall is the deceleration and diversion of the interstellar plasma flow about the heliosphere leading, through charge-exchange coupling, to a pile-up and heating of interstellar neutral hydrogen (Figure 4.2.1). The result is the formation of a giant wall, which acts to filter neutral hydrogen as it enters the heliosphere. Confirmation of the hydrogen wall's existence was not expected for decades, but a serendipitous convergence of predictive theoretical modeling, observations to place limits on the cosmological deuterium/hydrogen ratio, and a multidisciplinary investigation spanning space physics and astrophysics led to the detection of the hydrogen wall! This is the first of the boundaries separating the solar wind and the LISM to be discovered, and it offers a glimpse into the global structure of the three-dimensional heliosphere. Radio emission from the heliopause, probably driven by global interplanetary shock waves, also provides an opportunity to probe deeply into the remotest reaches of the heliosphere.
An unexpected astrophysics result to emerge from the recent work on the hydrogen wall is the first measurement of stellar winds from solarlike stars. The research leading to the discovery of the hydrogen wall is an excellent example of theory driving the frontiers of space science and motivating the development of new observational techniques and methodology to complement traditional space physics tools.

FIGURE 4.2.1 SOURCE: G.P. Zank. 1999. Interaction of the solar wind with the local interstellar medium: A theoretical perspective. Space Science Reviews 89: 413-688.
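The strength of the charge-exchange coupling described in Box 4.2 can be checked with a one-line estimate of the mean free path. The density and cross-section values below are assumed, order-of-magnitude numbers chosen for illustration, not figures from the report.

```python
# Order-of-magnitude sketch: how far a neutral H atom travels before a
# charge-exchange collision with the surrounding plasma.
N_H = 0.1          # assumed neutral H density near the heliopause, cm^-3
SIGMA_EX = 2e-15   # assumed H/H+ charge-exchange cross section, cm^2
CM_PER_AU = 1.496e13

mfp_cm = 1.0 / (N_H * SIGMA_EX)  # mean free path = 1 / (n * sigma)
mfp_au = mfp_cm / CM_PER_AU
# mfp_au comes out at a few hundred AU -- comparable to the size of the
# heliosphere and to the ~100 AU width of the predicted hydrogen wall.
```

Because the mean free path is comparable to the system size, the neutrals and plasma are strongly coupled without being collisionally fluid, which is exactly the nonequilibrated regime the models must treat.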
The unexpected creation of new electron and proton radiation belts during the March 24, 1991, geomagnetic storm challenged scientists to rethink the formation and stability of Earth's radiation belts. This storm was initiated by an interplanetary shock traveling at a remarkable 1,400 km/s. Fortuitous measurements by the Combined Release and Radiation Effects Satellite showed that the belts appeared within minutes near the "slot region" at 2.5 RE as the shock rapidly compressed Earth's magnetosphere (the largest compression on record). The newly formed belts persisted until 1994. How did the belts form? The prevailing empirical diffusion models gave energization rates that were orders of magnitude too slow, and they described only the intensification of existing zones of MeV particles rather than the formation of new stable belts. Advances in theory identified a new process involving fast resonant acceleration by inductive electric fields accompanying a rapid compression of the geomagnetic field. This theory emphasized the need for new approaches to predicting Earth's dynamic radiation belts and to space weather forecasting. One promising approach embeds a particle-pushing code for radiation belt particles in a global MHD simulation of the solar wind-magnetosphere interaction. Results for the 1991 storm (Figure 4.3.1) show that an initial outer zone electron source population, represented by the average NASA AE8MIN model in the upper right inset, was transported radially inward by the induction electric field to the sparsely populated slot region. This transport occurs on the MeV electron drift time of 1-2 min and produces a flux peak at 13 MeV for the 1991 event. A new proton belt was also formed at the same location by the same mechanism, but with greater energy (>20 MeV), trapping and transporting inward the extreme solar energetic proton source population produced by the interplanetary shock.
[Figure 4.3.1: simulated inward radial transport of the outer-zone electron source population during the March 1991 storm.] FIGURE 4.3.1 SOURCE: S.R. Elkington, M.K. Hudson, M.J. Wiltberger, and J.G. Lyon. 2002. MHD/particle simulation of radiation belt dynamics. Journal of Atmospheric and Solar-Terrestrial Physics 64(5-6): 607-615.
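The energization by inward transport described in Box 4.3 can be illustrated with a simple conservation argument: if the first adiabatic invariant mu = p^2/(2mB) is conserved for an equatorially mirroring electron and the dipole field scales as B ~ L^-3, then momentum scales as L^-3/2. The starting energy and L-shells below are illustrative assumptions of mine, not values from the Elkington et al. simulation.

```python
import math

MC2 = 0.511  # electron rest energy, MeV

def kinetic_after_transport(ke_initial_mev, l_initial, l_final):
    """Relativistic kinetic energy after radial transport, assuming mu
    conservation and a 90-degree pitch angle: pc scales as L^-3/2."""
    e_tot = ke_initial_mev + MC2
    pc = math.sqrt(e_tot**2 - MC2**2)
    pc_final = pc * (l_initial / l_final) ** 1.5
    return math.sqrt(pc_final**2 + MC2**2) - MC2

# A ~1 MeV outer-zone electron carried inward from L = 6 to the slot
# region near L = 2.5 gains several MeV in a single rapid transport event.
ke_final = kinetic_after_transport(1.0, 6.0, 2.5)
```

Even this toy calculation shows why prompt radial transport on the drift timescale, rather than slow diffusion, is needed to reach multi-MeV energies in minutes.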
Dynamo theory has made great strides in recent years, exploring the variety of dynamo forms that might occur in planets, stars, and galaxies. Recent comprehensive massive simulations of the dynamo of Earth would appear to have that problem well in hand. On the other hand, there is a serious problem with the dynamo of the Sun (and of other stars and of the galaxies). The extensive analytical and numerical models of the solar dynamo incorporate the notion of turbulent diffusion as an essential part of the dynamo. The concept of turbulent diffusion of the vector magnetic field is based on an intuitive analogy with the turbulent diffusion of a scalar field, such as a puff of smoke. The analogy provides just about the right diffusion rate to produce the 11- and 22-year magnetic cycle of the Sun, so it has been widely accepted. The difficulty is that the mean fields in the lower convective zone are now estimated to be 3,000 gauss or more, and such fields are as strong as the convection. Hence the field is too strong to be drawn out into the long thin filaments to which a puff of smoke would be subjected. So it is clear that we do not understand the diffusion that is essential for the solar dynamo. Combining this mystery with the mystery of the fibril structure of the magnetic field, observed at the visible surface and inferred in the lower convective zone, we must be cautious in accepting the theoretical dynamo models. It is imperative that we address the problem of diffusion and the fibril structure of the field.

Helioseismology is the study of the structure and dynamics of the solar interior using the frequencies, phases, and travel times of resonant waves observable as oscillations at the photosphere. The evolution of the discipline is a striking example of the close and sometimes unpredictable interplay between theory and observation.
Although the basic theory of nonradial stellar pulsations had been well developed in the early part of the 20th century to explain variable stars, it was never applied to quasi-static stars such as the Sun, and the first observations of solar oscillations in 1960 were in a sense fortuitous. Theorists had suggested evanescent acoustic waves propagating up through the atmosphere as a mechanism for the mysterious heating of the corona, and the unsuccessful search for such waves by looking for temporal variations in the Doppler shifts of Fraunhofer lines led to the discovery of surface oscillations in a band of frequencies around 3 mHz, lower than the frequencies of the coronal-heating waves sought. These so-called "5-minute oscillations" remained theoretically unexplained until 1970, when it was recognized that they could be the photospheric response to trapped interior waves of the acoustic-gravity mode class, a prediction that was confirmed by measurement of the wave dispersion relation. It was soon realized that the oscillations are part of an enormously rich spectrum of modes, now known to encompass wave numbers from 0 (purely radial) to well over 2,000 over the solar circumference, and ranging from purely surface modes to those penetrating to the center of the Sun. By properly decomposing and analyzing the observed spectrum, it became possible in principle to directly measure the depth dependence of sound speed, density, and bulk velocity in the interior of the Sun and to measure their departures from radial symmetry as well. The last three decades have witnessed the emergence of the new discipline of helioseismology.
Concurrent with the development of instruments and observing methods, a significant theoretical effort occurred to develop the complex analysis techniques required to find inversion methods with suitable resolution and precision, to enhance the resolution and physical complexity of the models of the structure and dynamics of the Sun and of its features, and to include wave phase information in the inferences. Finally, the necessity of analyzing many orders of magnitude more data than had hitherto been required has brought about major advances in the handling of data, and these have to some extent propagated throughout the field of solar physics. The ability to directly determine physical conditions deep in the interior of a star with astonishing accuracy (the sound speed is now known to about one part in 10^4 through most of the solar interior) has importance well beyond the immediate fields of stellar structure and evolution. It allows us to use the solar interior for the exploration of physics of systems inaccessible to the laboratory. The verification of interior models to such precision meant that the solution of the problem of the deficit of solar neutrinos observed on Earth had to be sought in the realm of fundamental particle physics. The complex atomic physics required to calculate the radiative opacity, the equation of state, and the nuclear reaction rates is strongly constrained by the necessity of matching the observationally supported solar models. The description of turbulent convective flows and the magnetohydrodynamics of the near-surface regions of the Sun, crudely parameterized or ignored in structural models, are now subjects of active research supported by observational data. It will soon be possible to measure and analyze the internal oscillations of other stars in different states; a few have already been detected. This will truly allow stellar interiors to be used as a laboratory for physical exploration.
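The trapping of the "5-minute oscillations" discussed above follows from the acoustic cutoff frequency of the solar atmosphere: waves below the cutoff are evanescent above the photosphere and are reflected back into the interior. The back-of-envelope check below uses the isothermal-atmosphere cutoff formula with assumed photospheric values (temperature, mean molecular weight, gravity); it is an illustrative sketch, not a result from the report.

```python
import math

K_B, M_H = 1.381e-23, 1.673e-27  # Boltzmann constant (J/K), H mass (kg)
T, MU, GAMMA, G = 5800.0, 1.3, 5.0 / 3.0, 274.0  # assumed photospheric values

H = K_B * T / (MU * M_H * G)                 # pressure scale height, m
c = math.sqrt(GAMMA * K_B * T / (MU * M_H))  # adiabatic sound speed, m/s
nu_ac_mhz = c / (4.0 * math.pi * H) * 1e3    # acoustic cutoff nu = c/(4*pi*H), mHz

# The cutoff comes out near 5 mHz, so the observed ~3 mHz band lies below
# it: those waves cannot propagate upward and behave as trapped interior
# modes, consistent with the 1970 interpretation described in the text.
trapped = 3.0 < nu_ac_mhz
```

The same dispersion-relation reasoning, applied mode by mode across the full observed spectrum, is what makes the inversion for interior sound speed possible.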
Near Earth the geomagnetic field is a dipole whose axis is tilted by about 12 degrees from Earth's rotational axis. As a consequence, the geomagnetic field is horizontal at low latitudes in the upper atmosphere and ionosphere (altitudes of 100 to 2,000 km). During the day, upper atmospheric winds produce dynamo electric fields that cause plasma on these geomagnetic field lines to drift upward, which leads to an elevated ionosphere with peak densities of 10^6 cm^-3 and peak altitudes as high as 600 km. At dusk, the ionosphere rotates into darkness, and in the absence of sunlight, the lower ionosphere rapidly decays. The result is that a steep vertical density gradient develops on the bottom side of the raised ionosphere. This produces the classical configuration for the Rayleigh-Taylor instability, in which a heavy fluid is situated above a light fluid (supported by the horizontal geomagnetic field). In this situation, a density perturbation can trigger the instability, and once triggered, the depleted densities at low altitudes bubble up through the raised ionosphere. The plasma density in the bubbles can be up to two orders of magnitude less than that in the surrounding medium, and the bubbles frequently drift upward with supersonic velocities. Typically, the bubbles develop into large structures. An example of a bubble structure is shown in Figure 4.6.1, which corresponds to a coherent backscatter measurement by the JULIA ground-based radar in South America. The bubbles can reach altitudes as high as 1,500 km, and the entire north-south extent of the magnetic field at low latitudes is usually depleted (30 degrees of latitude). The east-west extent of a disturbed region can be several thousand kilometers, with the horizontal distance between separate bubble domains being tens to hundreds of kilometers.
Although plasma bubbles have been observed for more than 30 years, the exact trigger mechanism is still unknown, and the three-dimensional structure of the bubbles has not been modeled. The latter requires new models that rigorously incorporate the microphysics in large-scale models of the background ionosphere.

[Figure 4.6.1: coherent backscatter signal-to-noise ratio (dB) as a function of altitude (roughly 300 to 800 km) and local time, 1996 September 6.] FIGURE 4.6.1 SOURCE: D.L. Hysell and J.D. Burcham. 1998. JULIA radar studies of equatorial spread F. Journal of Geophysical Research 103(A12): 29,155.
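How fast the bottomside configuration in Box 4.6 destabilizes can be captured by a one-line estimate. In the collisional limit, the linear growth rate of the generalized Rayleigh-Taylor instability is roughly g/(nu_in * L_n), where L_n is the vertical density-gradient scale length; every parameter value below is an assumed, representative post-sunset number of my own choosing, not a measurement from the JULIA data.

```python
# Rough linear Rayleigh-Taylor growth rate on the bottomside F layer
# (collisional limit): gamma ~ g / (nu_in * L_n).
g = 8.9       # assumed gravitational acceleration at F-region heights, m/s^2
nu_in = 0.5   # assumed ion-neutral collision frequency, s^-1
L_n = 20e3    # assumed bottomside density-gradient scale length, m

gamma = g / (nu_in * L_n)        # linear growth rate, s^-1
e_fold_min = 1.0 / gamma / 60.0  # e-folding time, minutes
# An e-folding time of tens of minutes explains why a seed perturbation at
# dusk can grow into fully developed bubbles within a single evening.
```

Note that this estimate says nothing about what provides the seed perturbation, which is precisely the unresolved trigger problem noted above.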
No project required the coordinated analysis of multiple data sets more than NASA's International Solar-Terrestrial Physics program, which was predicated on the need to identify, describe, and explain the multitude of physical relationships linking processes on the Sun to those in the heliosphere, magnetosphere, and ionosphere. The Space Physics Data Facility at NASA's Goddard Space Flight Center (GSFC) developed an Internet-based plotting and data downloading tool, CDAWeb, that allows interactive users to select key (low-resolution) parameters from multiple ground- and space-based instruments for plotting as lines or images versus time. When interesting intervals are identified, users can download the data sets seen in the plots in different formats, together with supporting documentation. Similar servers provide spacecraft trajectories and heliospheric and near-Earth solar wind observations. CDAWeb provides over 1,300 parameters from 150 data sets and 25 missions. Usage has grown steadily with time and is now running at 6,000 plots and 2,000 data file listings per month. The server at GSFC has been mirrored by archive centers in Germany, the United Kingdom, and Japan. Figure 4.7.1 shows an example of multiple plot types. The plot was generated interactively over the Web by selecting the parameters, instruments, spacecraft, and time intervals for study.

[Figure 4.7.1: stacked CDAWeb panels for 1998 September 23-24: Wind Magnetic Field Investigation definitive data, Wind 3-D Plasma Analyzer key parameters, and Wind Radio/Plasma Wave 1-minute radio data.]
FIGURE 4.7.1 SOURCE: Plot by R.W. McGuire, Space Physics Data Facility at NASA Goddard Space Flight Center.
4.4 A COHERENT FRAMEWORK FOR THEORY, MODELING, AND DATA EXPLORATION

COUPLING COMPLEXITY IN SPACE PLASMA SYSTEMS

Space physics has typically been organized according to distinct or discrete events or physical processes, and recommendations for future directions have been expressed in the context of these very specific (and somewhat isolated and idealized) problems. A very influential example of this approach appears in the Colgate Report,5 which identifies six problems6 as vital to further understanding of space plasmas. Substantial progress has been made in solving the problems identified in the Colgate Report, but the basic questions remain with us.7 Nevertheless, progress in the field has also served to crystallize the complexity, and therefore the challenge, of resolving many of the important problems of space physics. This complexity arises from the coupling across space and time scales (e.g., the generation of turbulence at boundary layers), the coupling of multiple constituents (e.g., the interaction of the solar wind with neutrals in the interstellar medium), and the linkage of different regions (e.g., the ejection of magnetized plasma from the surface of the Sun and its subsequent interaction with Earth's magnetosphere). The success stories given in Boxes 4.1 to 4.10 and the discussions in the text related to computation and data exploration illustrate these couplings, as do the examples in the following discussion. Distinct plasma regions and regimes are invariably coupled in a highly nonlinear dynamical fashion, with the implication that each region or physical process cannot be considered in isolation. Multiple plasma regions can be coupled through events that transfer mass, momentum, and energy from one region to another, such as the eruption and subsequent propagation of a coronal mass ejection from the solar surface through interplanetary space to perhaps Earth's magnetosphere. By contrast, a particular plasma region can admit multiple ion and atom populations, each governed by distinct plasma physical processes yet coupled to one another dynamically and self-consistently. An excellent example is the coupling of the solar wind to the local interstellar medium through the intermediary of neutral interstellar atoms (beyond some 10 to 15 AU, the dominant constituent of the heliosphere, by mass, is neutral interstellar H). Charge exchange serves to couple the plasma and neutral atom populations, yielding a highly nonequilibrated, nonlinear system in which the characteristics of both populations are strongly modified (pickup ions, anomalous cosmic rays, and very hot neutral atoms are some of the by-products created). Charge exchange is important, too, at solar system bodies such as Venus and Io, and of course energetic neutral atom imaging of the terrestrial magnetosphere has become an important new tool.

5National Academy of Sciences (NAS), Space Science Board. 1978. Space Plasma Physics: The Study of Solar-System Plasmas, Vol. 1. NAS, Washington, D.C.
6These were (1) magnetic field reconnection; (2) interaction of turbulence with magnetic fields; (3) the behavior of large-scale plasma flows and their interaction with each other and with magnetic and gravitational fields; (4) acceleration of energetic particles; (5) particle confinement and transport; and (6) collisionless shocks.
7See, for example, NRC, 1995, A Science Strategy for Space Physics; NRC, 1988, Space Science in the Twenty-First Century: Solar and Space Physics; and NRC, 1998, Supporting Research and Data Analysis in NASA's Science Programs, all from the National Academy Press, Washington, D.C.
The self-consistent coupling of disparate plasma regimes, each governed by possibly distinct plasma physical processes, is a chal- lenge that must be addressed if the current fleet of satel- lites and ground-based observatories and ambitious new initiatives such as LOOS, Solar Probe, and Interstellar Probe are to be successfu I Iy and fu I Iy exploited. Furthermore, the space plasma environment typi- cally possesses a multiplicity of spatial and temporal scales, and the nonlinear, dynamical, self-consistent feedback and coupling of all scales determines the evo- lotion of the system through the creation of large- and small-scale structure. Excellent examples are recon- nection and turbulence, where a marriage of large-scale, slow MHD behavior and fast, small-scale kinetic pro- cesses is needed to further our understanding of funda- mental nonlinear processes that arise in space plasmas. Understanding multiscale feedback in space plasma sys- tems will be one of the most important and challenging problems facing theorists and modelers over the next decade. Different plasma regimes (and possibly even scales) are often separated by narrow boundaries, and the cou- pling and structure of large- and small-scale processes frequently control the evolution of the boundaries. The quintessential examples are collisionless shock waves and auroral arcs. The sharp gradients and the coupling of disparate scales in boundaries continue to challenge our understanding and modeling of space physics problems.
PANEL ON THEORY, MODELING, AND DATA EXPLORATION

Data assimilation techniques were first used in numerical weather prediction, when meteorologists were confronted with having to solve an initial value problem without the right initial data. Specifically, there were insufficient synoptic observations to initiate a model run that could be used to predict the weather. To overcome this obstacle, the meteorologists developed a methodology that is now called data assimilation. This methodology uses data obtained at various places and times, in combination with a physics-based (numerical) forecast model, to provide an essentially time-continuous "movie" of the atmosphere in motion. During the last 40 years, meteorologists have continually improved their ability to predict the weather, both as a result of model improvements and because of a large infusion of new satellite and ground-based data.

Following the example set by meteorologists, oceanographers began to use data assimilation techniques about 20 years ago. Recently, using a numerical model of the Pacific Ocean, in combination with a large number of distributed ocean measurements, oceanographers were able to successfully predict the coming of the last El Niño.

The solar and space physics community has been slow to implement data assimilation techniques, primarily because it lacks a sufficient number of measurements. However, this situation is changing rapidly, particularly in the ionosphere arena.
It is anticipated that within 10 years, there will be several hundred thousand ionospheric measurements per day from a variety of sources, and these data will be available for assimilation into specification and forecast models. The data sources include in situ electron densities measured by NOAA and DOD operational satellites; bottomside electron density profiles from a network of 100 digisondes; line-of-sight total electron content (TEC) measurements between as many as 1,000 ground stations and Global Positioning System satellites; TECs between low-altitude satellites with radio beacons and specific ground-based chains of stations; TECs via occultations between various low-altitude satellites and between low- and high-altitude satellites; and line-of-sight optical emission data, which can provide information on plasma densities. Furthermore, the images of the plasmasphere and magnetospheric boundaries recently obtained by the IMAGE spacecraft can be used in data assimilation models. A magnetospheric constellation with more than 10 spacecraft would provide an invaluable dataset for data assimilation models.

There are numerous data assimilation techniques, but perhaps the one that has gained the most prominence is the Kalman filter. This filter provides an efficient means for assimilating different data types into a time-dependent, physics-based numerical model, taking into account the uncertainties in both the model and the data. Using a sequential least-squares procedure, the Kalman filter finds the best estimate of the state (ionosphere, neutral atmosphere, or magnetosphere) at time t based on all information prior to this time. Formally, the Kalman filter performs a recursive least-squares inversion of all of the measurements (TEC, in situ satellite, etc.)
for the model variable (e.g., plasma density) using the physics in the model as a constraint. The net result is an improved estimate of the model variable; it has the least expected error, given the measurements, model, and error statistics.

Use of data assimilation techniques has recently been initiated in ionosphere-thermosphere studies, and it is clear that such techniques will have a major impact on the field during the next decade. With a physics-based model of the ionosphere assimilating hundreds of thousands of measurements per day, global ionospheric reconstructions will be available hour by hour throughout the year. With this information at hand, the scientific community should be able to resolve a host of long-standing basic science issues, and the operational community should have reliable ionospheric parameters for the various products. Eventually, a similar capability will be achieved for the study of Earth's upper atmosphere and magnetosphere, but perhaps not during the coming decade because of the lack of data. Because the development and testing of physics-based assimilation models are labor intensive, it is important that this effort begin now for other solar and space physics domains.

Finally, space plasmas, as illustrated in extraordinary detail by images of the Sun obtained by the spacecraft TRACE, can change their (magnetic field) configuration on extremely short time scales, with an accompanying explosive relaxation in the associated plasma. Like plasma boundaries, the rapid evolution of a plasma from one state to another continues to challenge theorists and modelers.

From the above discussion, a very natural classification or ordering of solar system plasma physics, and one that distances itself from the very regional- and event-based ordering of the past several decades, is (1) space plasma couplings across regions, (2) couplings across scales, (3) physics of boundaries, and (4) explosive relaxation of plasmas.
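The sequential least-squares update at the heart of the Kalman filter described earlier can be sketched in a few lines. The two-cell "density" state and the summing (TEC-like) observation operator below are invented purely for illustration; an operational ionospheric filter would also carry a physics-based forecast step between updates:

```python
import numpy as np

def kalman_update(x, P, H, R, z):
    """One sequential least-squares (Kalman) update: blend a model
    state estimate x (error covariance P) with measurements
    z = H x + noise (measurement covariance R)."""
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ (z - H @ x)      # state corrected toward data
    P_new = (np.eye(len(x)) - K @ H) @ P   # uncertainty reduced
    return x_new, P_new

# Toy example: two-cell plasma-density state, one integrated
# (TEC-like) measurement that sums both cells.
x = np.array([1.0, 2.0])             # model forecast
P = np.eye(2) * 0.5                  # forecast error covariance
H = np.array([[1.0, 1.0]])           # line-of-sight sum operator
R = np.array([[0.1]])                # measurement error variance
z = np.array([3.6])                  # observed "TEC"

x, P = kalman_update(x, P, H, R, z)
print(x)   # both cells pulled toward the measurement
```

Note that the single integrated measurement updates both cells at once, weighted by the model and data uncertainties; this is exactly the mechanism that lets sparse, indirect observations constrain a gridded physical model.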
In developing the high-priority theory and modeling challenges listed in Box 4.11, the
panel focused on problems that span several categories. These are problems that are fundamental to the further development of space physics, that cut across the different panels of the Solar and Space Physics Survey Committee, and that have the potential to influence both astrophysics and laboratory plasma physics. Box 4.11 lists the space physics areas to which the panel gives highest priority; because the panel does not believe that the individual entries can admit a meaningful ranking, they are not listed in priority order. Improving our understanding of the problems identified in Box 4.11 will lead to major advances in the field of space physics over the next decade, and central to this advance will be the development of models and theories that embrace the highly nonlinear dynamical coupling and feedback of different and disparate scales, processes, and regions.

As suggested early in this section, the notion of "coupling complexity" refers to a class of problems or systems that encompass significantly different scales, regions, or particle populations, the understanding of which requires more than one set of defining equations or concepts. As discussed above, for example, the heliosphere contains cosmic rays, the solar wind, neutral atoms, and pickup ions, each of which interacts with the others and requires its own set of equations and coupling terms. Similarly, the ionosphere-thermosphere and magnetosphere are different regions governed by distinct physical processes.

THE CHALLENGES OF COUPLING COMPLEXITY

To embrace the demands imposed by coupling complexity, as defined above, in resolving the problems listed in Box 4.11, theorists, modelers, and data analysts must address a number of challenges:

1. Formulation of sophisticated models that incorporate disparate scales, processes, and regions, the development of analytic theory, and the maintenance of a strong connection to basic science;
2.
Computation;
3. Incorporation of coupling complexity into space physics models;
4. Integration of theory, modeling, and space- and ground-based observations;
5. Data exploration and assimilation; and
6. Transition of scientific models to operational status in, e.g., space weather activities.

Each of the above challenges requires some elaboration.

Challenge 1: Formulation of Sophisticated Models

In recognizing that multiple scales, regions, processes, and plasma populations are intrinsic to the challenging space physics problems of today, the correct

Tomography has been used extensively by the medical community for several decades, but it was not until about 1988 that this technique was first applied to the ionosphere. Ionospheric tomography is more difficult than medical tomography because the ionosphere varies with time, while a patient is generally motionless during a tomographic scan. Also, for ionospheric tomography, the scanning directions are limited. Nevertheless, to date, both radio and optical tomography have been used in ionospheric applications. With radio tomography, radio transmissions from a low-Earth-orbiting satellite (or satellites) are received along a chain of stations, with the stations typically distributed along a line. The signals received at the stations are used to measure the total electron content (TEC) along the ray paths. Each station records a large number of ray paths as the satellite traverses the station, with the pattern of ray paths taking the form of a partially opened fan. With multiple stations along a line, there are a large number of intersecting ray paths, and the associated TECs are inverted by a mathematical algorithm to obtain a two-dimensional reconstruction of the electron density as a function of altitude and distance along the chain of stations. Optical tomography works in a similar way, but instead of TECs, integrated (line-of-sight) optical emissions are measured.
At the present time, tomography chains exist in the United States, South America, parts of Europe (including Scandinavia), Russia, and Asia, and these chains provide information about ionospheric weather in these local regions. Although tomography chains are relatively new, they already have been used very successfully in reconstructing several different ionospheric density features, including plasma troughs, auroral boundary blobs, traveling ionospheric disturbances, equatorial ionization crests, and equatorial plasma bubbles. During the next decade, tomography is anticipated to play an important role in elucidating ionospheric weather features.
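The "mathematical algorithm" that inverts the intersecting TECs is commonly an algebraic reconstruction technique (ART, the Kaczmarz method). The sketch below is a toy version on a 2x2 density grid with a hypothetical four-ray geometry; note that, exactly as cautioned above, the limited scanning directions make the inversion non-unique, so ART recovers a density field that fits the data but is not the true one:

```python
import numpy as np

def art_reconstruct(A, tec, n_sweeps=200):
    """Kaczmarz/ART: iteratively solve A n = tec for the gridded
    electron density n, where row i of A holds the path length of
    ray i through each cell and tec[i] is the TEC along that ray."""
    n = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for ai, ti in zip(A, tec):
            n += (ti - ai @ n) / (ai @ ai) * ai   # project onto ray i
    return n

# Toy 2x2 density grid (cells in row-major order) and four rays.
true_n = np.array([1.0, 2.0, 3.0, 4.0])
A = np.array([[1.0, 1.0, 0.0, 0.0],   # ray through the top row
              [0.0, 0.0, 1.0, 1.0],   # ray through the bottom row
              [1.0, 0.0, 0.0, 1.0],   # one slanted ray
              [0.0, 1.0, 1.0, 0.0]])  # the other slanted ray
tec = A @ true_n                      # simulated TEC measurements
recon = art_reconstruct(A, tec)
print(recon)   # fits the data exactly, yet differs from true_n
```

Here the four rays are linearly dependent (rank 3), so ART converges to the minimum-norm solution [1.5, 1.5, 3.5, 3.5] rather than the true densities, a compact illustration of why ray-path geometry limits ionospheric tomography.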
mathematical and physical formulation is critical. In this, there is no substitute for time-honored analytical approaches to theoretical developments in plasma physics, fluid dynamics, and applied mathematics. Progress on highly nonlinear, coupled plasma problems may be made using techniques that range from the relatively standard to nonlinear, low-order reductive approaches and statistical methods. Current agency funding programs appear to be largely adequate (although subject to budgetary pressure) to support these basic theoretical efforts, although more innovation and bolder ideas, approaches, and techniques in proposed research should be encouraged and rewarded. The panel cannot emphasize strongly enough that the program outlined in this report must maintain a strong and vigorous connection to basic space science. Computation is no substitute for the development of rigorous theories and well-conceived models. Challenge 1 must be regarded as a critical component of the recommendations that are developed below.

Challenge 2: Computation

Many of the problems listed in Box 4.11 impose significant computational demands in terms of CPU power and the concomitant development of sophisticated and efficient codes. Two challenges face the community. The first is to further develop existing codes and algorithms, such as three-dimensional MHD codes that incorporate adaptive mesh refinement, three-dimensional hybrid codes with improved electron/ion mass ratios, or improved codes for data exploration. These problems do not demand the inclusion of new physics to handle coupling complexity but demand instead substantial progress in current research areas. The second challenge lies in developing and implementing new computational approaches for both model solving/simulation and data exploration that exploit advances made by numerical mathematicians, statisticians, and computer scientists.
To meet both challenges, the panel strongly encourages funding agencies to augment the grants of modelers for a limited period to allow them to make their software public and to provide adequate documentation and support. At the same time, the panel warns against turning the open-source (or "community code") concept into an exercise in computer science with an excessive emphasis on standards, version control, interoperability, etc. Open-source codes should remain scientific codes first and foremost.

Challenge 3: Incorporation of Coupling Complexity into Space Physics Models

The coupling of different physical processes, scales, and regimes gives rise to the relatively new science of coupling complexity in space plasma systems. Obviously space physics has always attempted to incorporate as many physical processes as possible into a particular model. However, only in the last few years have our understanding of the basic underlying physical processes (through the acquisition of data and advances in theory, among others) and our access to powerful computers been sufficient to allow space physicists to make a reasonable effort to explore and model systems as opposed to processes.

The self-consistent incorporation of multiple scales, physical processes, and distinct regions into models will be the main challenge to theorists and modelers in the coming decade, demanding the formulation and development of sophisticated models and theory, the development of new and innovative algorithms, access to sophisticated computational resources, and the opportunity to test model predictions and validate theories against existing and future observations. The panel anticipates that models and theories that address coupling complexity will demand sophisticated new measurements, which will in turn drive and define new space- and ground-based missions (in situ, multipoint, remote, etc.) (Box 4.12).
The panel advocates both the synergistic investigation of well-chosen, isolated theoretical problems and the development of coupled global models. For major advances to be made in the space physics science of coupling complexity, fundamental theoretical analysis, sophisticated computational tools, and state-of-the-art data analysis must all be coupled intimately under a single umbrella program. Theoreticians working with pen and paper, computational space physicists, and data analysts will be needed collectively to achieve the major advances expected of space physics. Only by creating and maintaining major groups of this sort can a strong and vital connection between basic science, computation, and observations be achieved. What is needed to develop a research program that addresses coupling complexity are the following:

· Time, that is, long-term stable funding;
· Synergistically interacting groups of students, postdoctoral associates, research scientists, and several university or institutional faculty who are able to inte-
The Magnetospheric Specification Model (MSM), as currently implemented by the NOAA Space Environment Center, computes fluxes of energetic electrons and ions in real time based on an estimated index of geomagnetic activity, Kp. The model provides a rough picture of present space environment conditions, in particular near geosynchronous orbit. The representation is imprecise, partly because particles are injected into geosynchronous orbit by substorms, which are not yet well understood, and partly because driving the model with the 3-hour Kp index causes it to miss short-term variability. Figure 4.10.1 shows an example of routine MSM output in the equatorial plane, which also exemplifies the high spatial variability of particle fluxes in the inner magnetosphere.

The MSM was developed at Rice University and is based on the research-grade RCM (Rice Convection Model), which was greatly simplified for operational use. Numerical solutions of some differential equations were replaced by simpler, observation-driven, semiempirical algorithms. It is noteworthy that the development of the MSM, exclusive of its scientific basis, took about 7 years and substantial resources.

FIGURE 4.10.1 Example of routine MSM output in the equatorial plane. SOURCE: Unpublished figure courtesy of Richard Wolf, Rice University.
grate diverse scientific, theoretical, and computational viewpoints and to bring the complementary tools of theoretical and analytic techniques, computational physics, and data analysis to a major research problem;
· Sufficient computing and institutional resources (say, the ability to purchase and administer a Beowulf system or the ability to support computational physicists, theoretical physicists, and data analysts under a single umbrella program);
· The opportunity to develop highly sophisticated codes and models to extend and validate theories and models and to make predictions. These will eventually be transitioned to the community and will require resources for software development and support; and
· A commitment from the home institution (university, laboratory, industry) to develop a stable, long-term program in space physics by creating permanent positions. This would provide an intellectual environment in which large research efforts can flourish and would allow for critical-mass efforts.

Are the existing resources and programs adequate? The panel has carefully examined existing research programs and avenues supported by NASA, NSF, NOAA, DOE, DOD, the Air Force, the Navy, and the Army and concludes that the resources for a full-scale coupling complexity research program are inadequate or nonexistent. This conclusion resulted from the panel's examination of the existing research programs at the various agencies and from direct meetings with and presentations by NASA and NSF administrators. Programs such as the SECTP, MURI, the Supporting Research and Technology (SR&T) program, the HPCC program, ITR (an NSF program), and the CEDAR, GEM, and SHINE programs do not meet the five needs listed above and have goals that do not resonate well with a coupling complexity program. A clear distinction can be made between the recommended Coupling Complexity Initiative and the current SECTP.
To simply enhance the existing SECTP is infeasible since the Coupling Complexity Initiative (1) is aimed primarily at systems throughout the heliosphere rather than isolated physical processes and does not emphasize the Sun-Earth connection; (2) would bring large numbers of theorists, computational modelers, and data analysts together in a synergistic fashion and let them work on different aspects of a complex physical system; (3) is intended to foster the growth of space physics at universities and institutes;
and (4) would have grants that are very much larger and of longer duration than those of the SECTP. Clearly, the intent of the Coupling Complexity Initiative is quite different from the goals of the SECTP. Programs such as SECTP are simply of insufficient duration, size, and scope and do not meet many of the needs identified here. Other programs, such as CEDAR/GEM/SHINE and SR&T, are excellent but clearly of very limited scope.

The challenges we will face in the coming decade from new and complicated missions (clusters, constellations, LOOS), the piecing together of disparate data sets and results from space- and ground-based observations, and achieving the LWS goal of synthesizing and understanding the global coupled Sun-Earth system demand a far more sophisticated approach to theory and modeling that addresses coupling, nonlinearity, and multiscale, multiprocess, and multiregional feedback. Similarly, extraordinarily ambitious new missions such as Solar Probe and Interstellar Probe demand innovative theory and modeling since these missions will be unique and among the great scientific enterprises of the new century. Furthermore, coupling complexity will be of great importance to other fields, such as astrophysics and laboratory physics.

Recommendation 1. NASA should take the lead in creating a new research program, the Coupling Complexity Research Initiative, to address multiprocess coupling, nonlinearity, and multiscale and multiregional feedback in space physics. The research program should be peer reviewed. It should do the following:

· Provide long-term, stable funding for a 5-year period.
· Provide sufficiently large grants that critical-mass-sized groups of students, postdoctoral associates, and research scientists, gathered around university and institutional faculty, can be supported.
· Provide funding to support adequate computational resources and infrastructure for the successfully funded groups.
· Facilitate the development and delivery of community-based models.
· Use the grants to leverage faculty and permanent positions and funding from home institutions such as universities, laboratories, institutes, and industry.

The scope of a successful coupling complexity grant must recognize that a combination of remote sensing, in situ measurements, and advanced modeling will pro-

Examples of high-priority space science themes that are associated with coupling complexity are numerous and embrace many of the most significant challenges in space physics. The following have been identified as excellent examples of coupling complexity in that they are fundamental to the further development of space physics, they cut across the different areas addressed by the five panels of the Solar and Space Physics Survey Committee, and they have the potential to influence both astrophysics and laboratory plasma physics. The examples below all span three or more of the four classes that introduced the notion of coupling complexity (see p. 201).

Examples within solar physics are (1) coronal heating, which clearly requires that we address the coupling of physical processes across regions and scales as well as incorporate the explosive relaxation of plasmas; (2) coronal mass ejections and flares, which demand that all four classes be addressed; (3) the dynamo problem, which remains one of the major outstanding problems in solar physics and for which classes 1, 2, and 3 are all factors; and (4) solar variability (classes 1, 2, and 3), which is of both scientific and economic importance.
Examples within heliospheric physics are (1) the acceleration of the solar wind and the polar wind (classes 1, 2, and 3), both of which are major outstanding theoretical problems; (2) the interaction of the solar wind with the local interstellar medium (again, classes 1, 2, and 3), which is becoming a topic of increasing importance with fundamental implications for astrophysics; (3) turbulence in the interplanetary medium (classes 1 to 3), which remains as a great classical problem; and (4) transport phenomena (all four classes) for both particles and fields, which is another problem of classical origin.

Examples related to the interaction of the solar wind with planets are (1) the physics of planetary ionosphere-magnetosphere mass exchange; (2) magnetic storms; (3) substorms; and, of course, (4) climate variability due to solar influences. All four examples here span all four classes.

Some examples are so broad ranging that they are of importance to solar, heliospheric, magnetospheric, and ionospheric physics. Examples include (1) current layers, boundaries, and shock waves (classes 1 to 4); (2) particle acceleration (also classes 1 to 4); (3) turbulence; and (4) changes in magnetic field topologies and plasma configurations.
The ideas expressed in Challenge 3 are illustrated here by an example that has received recent and extensive discussion within the magnetospheric community. It has been recognized in the past few years that magnetohydrodynamics (MHD) cannot adequately model reconnection, so that implementing adaptive mesh refinement or related techniques to finer and finer scales is insufficient to clarify the physical processes. One approach that has been advocated is to develop methods for embedding kinetic models locally within larger-scale MHD simulations. At small scales, the MHD description fails and a new physics model (multifluid or kinetic) must be incorporated. As a preliminary approach, one can envisage using a hybrid model in a predetermined region of space in an MHD simulation. Eventually, the approach should be adaptive, so that as small scales develop in the MHD model, the grid refinement is coupled to the implementation of a higher-level physical model. This represents one aspect (global self-consistent computer simulations) of a program designed to further our understanding of reconnection. However, to completely elucidate this challenging problem will also require complementary studies that address theoretically related fundamental questions using, for example, analytical and other techniques and that might address aspects of reconnection that cannot be handled adequately within large-scale simulations. Similar complementary data studies are needed to both guide and refine the theoretical and computational studies.

vide the major advances in space physics over the next several decades. Successful grants will address problems by developing models and theories that define and drive missions, and the models will demand that observations and measurements reach new levels of sophistication, possibly through the use of new techniques.
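The adaptive handoff from fluid to kinetic physics described above hinges on a refinement criterion. A minimal sketch, with an entirely illustrative one-dimensional tanh current sheet, is to flag cells where the local gradient scale of the magnetic field approaches the ion inertial length (the grid, sheet thickness, and threshold here are arbitrary choices, not values from any production code):

```python
import numpy as np

def flag_for_kinetic(B, dx, d_i, threshold=10.0):
    """Flag cells of a 1-D MHD field B where the gradient scale
    L = |B| / |dB/dx| falls within `threshold` ion inertial lengths
    d_i, i.e., where the fluid description is breaking down and a
    kinetic (or hybrid) model would be embedded."""
    dBdx = np.gradient(B, dx)
    L = np.abs(B) / np.maximum(np.abs(dBdx), 1e-30)
    return L < threshold * d_i

# Toy current sheet: the field reverses across a thin layer.
x = np.linspace(-1.0, 1.0, 401)
B = np.tanh(x / 0.01)                # sheet half-thickness ~0.01
flags = flag_for_kinetic(B, x[1] - x[0], d_i=0.005)
print(flags.sum(), "of", flags.size, "cells flagged for kinetic physics")
```

Only the handful of cells inside the sheet are flagged, so the expensive kinetic model would be confined to a small embedded patch, which is precisely the economy that motivates the coupled MHD-kinetic approach.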
The models and algorithms developed through the Coupling Complexity Initiative will make invaluable contributions to future NASA, NSF, and NOAA activities, which are expected to focus on remote sensing and multipoint measurements. Because processes acting within and on the various regions in the heliosphere occur at different locations and times, global models are needed to relate past events that occur in adjacent regions to the in situ measurements made along spacecraft trajectories and to remote-sensing scan planes. Also, since the processes operate over widely different spatial and temporal scales, models are needed to integrate the effects so that they can be related to line-of-sight (column-integrated) ground-based or spacecraft measurements. In addition, coupled global models are useful for providing a framework in which large multispacecraft, multisite data sets can be organized and interpreted. Finally, hybrid models that rigorously include both macroscopic and microscopic physical processes are needed to relate measurements obtained from multispacecraft missions that focus on boundary layers, so that important acceleration processes can be elucidated.

A fundamental component of this recommendation is that the award of a grant will carry with it the expectation of a commitment from the home institution (university, laboratory, industry) to develop a stable, long-term program in space physics by the creation of permanent positions; this provides an intellectual environment in which large, critical-mass research efforts can develop and flourish.

Since nearly 30 groups submitted proposals to the most recent NASA SECTP, the panel used this as a measure of the potential large groups that exist currently in this country. A healthy space physics program should support about a third of these, in the estimation of the panel.
Accordingly, it is recommended that the Coupling Complexity Initiative should support 10 groups, each at a funding level of between $500,000 and $1 million per year. This would require the commitment of $7.5 million to $10 million per year in funding. Many of the goals of the LWS program resonate very well with the Coupling Complexity Initiative, which might be its natural home within NASA. NSF has expressed considerable interest in the broad field of complexity, and the panel anticipates that the Coupling Complexity Initiative might also fit well into these programs. Similarly, DOE, NOAA, and possibly even the DOD have a fundamental interest in coupling complexity. Cross-agency programs have proved successful in the past. The panel recommends the formation of a cross-agency commission, with NASA possibly taking the lead through its LWS program, to examine the implementation of a cross-agency Coupling Complexity Initiative.

Challenge 4: Integration of Theory, Modeling, and Observations

Theoretical space physics develops models, which can be tested observationally and then refined. Ideally,
it also defines research frontiers and then drives new mission and observation concepts. Theory is therefore both proactive and reactive, for besides demanding new observations, it must meet the challenges posed by existing data. The synergy that must exist between theory and experiment needs to be strengthened. The successful deployment of a scientific payload does not necessarily correspond to a successful scientific mission; it is the importance of the returned data to theoretical models and our ability to fully analyze, exploit, optimize, and refine these data that ultimately determine the success or failure of a mission. As important as instruments, space- or ground-based, in situ or remote, are to a scientifically successful mission, so too are data analysis, theory, and modeling. The panel strongly supports and endorses the current NASA Guest Investigator program and would like to see it strengthened, with similar programs created in other agencies.

Recommendation 2. The NASA Guest Investigator program should (1) be mandatory for all existing and new missions, (2) include both space- and ground-based missions, (3) be initiated some 3 to 5 years before launch, and (4) be peer reviewed and competed for annually, with grant durations of up to 3 years. Funding, at a minimum equivalent to 10 percent of the instrument cost, should be assigned to the Guest Investigator program and should explicitly support scientists working on mission-related theory and data analysis. Further, the Guest Investigator program for each mission should have the same status as a mission instrument. Other agencies should also consider guest investigator initiatives within their programs.
The implementation of this recommendation would address the very real concerns expressed by many experimentalists that too few theorists play an active role in exploring, interpreting, refining, and extending observations returned by expensive missions. The panel notes that in an era of fast missions, a cadre of theorists and data explorers needs to be in place and already active to take full advantage of a newly launched mission. A robust Guest Investigator initiative may also address the expressed concern that NASA expects PIs to submit proposals with extensive science goals but does not provide sufficient funding to support the science.

As set forth in Recommendation 2, at least 10 percent of the instrument cost should be assigned to the Guest Investigator program, and this funding should be budgeted in the mission costs from the outset. The panel emphasizes that this is a bare minimum and that as much as 50 percent of the instrument cost would be more reasonable. The panel also recommends that a Guest Investigator program begin a few years prior to launch. For a $160 million MIDEX-class mission, 10 percent of the instrument cost may amount to approximately $3 million. Beginning 3 years before launch and assuming a 6-year lifetime for the mission after launch implies guest investigator funding of $1 million for every 3-year funding cycle, yielding a total of perhaps 12 guest investigators for the mission. The panel would regard 10 percent as a reasonable minimum since between 10 and 15 investigators are needed for a mission to provide a critical mass of researchers for data, theory, and modeling investigations. Although the panel would like to see considerably more funding for the Guest Investigator program, existing data analysis lines should not be cut to meet budgetary shortfalls elsewhere. It is essential that very expensive missions, ground- or space-based, should not be undercut by inadequate funding support for the data analysis.
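The budget arithmetic above can be checked directly. Note that the roughly $30 million instrument cost is an inference (10 percent of it is approximately the $3 million the text quotes); only the $160 million total mission cost is stated:

```python
# Back-of-the-envelope check of the Guest Investigator budget above.
# The ~$30M instrument cost is assumed so that 10% of it ~= $3M;
# the text states only the $160M total MIDEX-class mission cost.
instrument_cost = 30e6                  # dollars, assumed
gi_pool = 0.10 * instrument_cost        # minimum GI allocation

pre_launch_years, lifetime_years, cycle_years = 3, 6, 3
n_cycles = (pre_launch_years + lifetime_years) // cycle_years
per_cycle = gi_pool / n_cycles          # funding per 3-year cycle

n_investigators = 12                    # "perhaps 12" per the text
per_investigator = gi_pool / n_investigators

print(f"${gi_pool/1e6:.0f}M pool over {n_cycles} cycles: "
      f"${per_cycle/1e6:.0f}M per cycle, "
      f"${per_investigator/1e3:.0f}k per investigator")
```

The result, $1 million per 3-year cycle, matches the figure in the text; it also implies about $250,000 per guest investigator over the mission, i.e., modest individual grants.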
This is particularly true of LWS data analysis funds.

Challenge 5: Data Exploration and Assimilation

Government-supported ground- and space-based research projects are currently returning an unparalleled wealth of in situ and remote space physics observations. Systematic and correlative analyses of both these observations and archived observations help researchers test theories, identify new phenomena, define solar variability, and establish the relationships linking the Sun and interplanetary space to Earth's magnetosphere and ionosphere. The observations are essential to monitor and forecast space weather.

Research in space physics often requires the interpretation and intercomparison of large, complex, and diverse data sets derived from multiple instruments and locations. It already requires the assimilation of real-time data sets into numerical simulations and the direct comparison of observations with computer simulations (empirical and numerical). Successful exploitation of the increasingly large data sets expected from future projects and simulations will demand sharing and integration of archives on scales beyond current experience. The panel believes, and past experience has demonstrated, that observers, PIs, and modelers are best suited to maintaining, documenting, and providing their own real and synthetic data sets, including the simulation codes, whenever possible. On the other hand, data mining requires access to these data sets via centralized information trees
and common standards. Only a distributed virtual data system can provide ready access to a wide variety of well-supported, high-quality data.

Recommendation 3. NASA should take the lead in convening a cross-agency consultative council that will assist in the creation of a cross-agency, distributed space physics information system (SPIS). The SPIS should link (but not duplicate) national and international data archives through a suite of simple protocols designed to encourage participation of all data repositories and investigator sites with minimal effort. The data environment should include both observations and model data sets and may include codes used to generate the model output. The panel's definition of data sets includes simulation output and supporting documentation.

Among other tasks, the system should do the following:

1. Maintain a comprehensive online catalog of both distributed and centralized data sets.
2. Generate key parameter (coarse resolution) data and develop interactive Web-based tools to access and display these data sets.
3. Provide higher-resolution data; error estimates; supporting platform-independent portrayal and analysis software; and appropriate documentation from distributed principal investigator sites.
4. Permanently archive validated high-resolution data sets and supporting documentation at designated sites and restore relevant offline data sets to online status as needed.
5. Develop and provide information concerning standard software, format, timing, coordinate system, and naming conventions.
6. Maintain a software tree containing analysis and translation tools.
7. Foster ongoing dialogues between users, data providers, program managers, and archivists, both within and across agency boundaries.
8. Maintain portals to astrophysics, planetary physics, and foreign data systems.
9. Survey innovations in private business (e.g., data mining) and introduce new data technologies into space physics.
10. Regularly review evolving archival standards.
11. Support program managers by maintaining a reference library of current and past data management plans, reviewing proposed data management plans, and monitoring subsequent adherence.

THE SUN TO THE EARTH AND BEYOND: PANEL REPORTS

While a smoothly functioning data system will require enhanced funding, it is essential that the data system grow at a pace consistent with actual user needs, as measured by data set requests and community input. Solutions imposed by a central authority often fail to satisfy the requirements of working scientists. The SPIS must enable community researchers themselves to identify and prioritize problems, and then propose and implement practical solutions.

An appropriate and cost-effective management structure would include a small supervisory office (full-time project scientist, project manager, and administrative assistant) that reports to program managers at each funding agency, distributes funding to the system's nodes, and ensures cross-disciplinary integration. A panel composed of community members and representatives from the primary nodes should advise this supervisory office. Primary nodes within the SPIS should be organized by well-recognized scientific discipline (e.g., solar, heliospheric, magnetospheric, and ionospheric nodes). Each primary node should employ a half-time scientist, a quarter-time administrator, and a half-time programmer. The primary nodes would sponsor a set of smaller, dynamically evolving nodes tasked with accomplishing specific objectives, such as restoring and validating data sets or developing software and translators. To ensure responsiveness to evolving needs, all functions within the data system, including the central office, must be competed for periodically.
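The distributed, node-based architecture recommended above (independent discipline nodes that keep their own holdings, with a thin central catalog that federates discovery rather than duplicating data) can be illustrated in a few lines. All node names, data set identifiers, and the interface itself are invented for illustration; this does not represent any existing SPIS or Space Physics Data System interface.

```python
# Toy sketch of a federated data catalog of the kind recommendation 3
# describes: each discipline node maintains its own holdings, and a thin
# central index answers queries by consulting the nodes rather than by
# copying their data. All names and identifiers are invented.

class Node:
    def __init__(self, name, holdings):
        self.name = name
        self.holdings = holdings    # data set id -> metadata dict

    def query(self, keyword):
        """Return ids of data sets whose description mentions the keyword."""
        return [ds for ds, meta in self.holdings.items()
                if keyword.lower() in meta["description"].lower()]

class CentralCatalog:
    """Thin index: holds pointers to nodes, not copies of their data."""
    def __init__(self):
        self.nodes = []

    def register(self, node):
        self.nodes.append(node)

    def search(self, keyword):
        """Fan a query out to every registered node."""
        return {node.name: node.query(keyword) for node in self.nodes}

# Hypothetical discipline nodes with invented holdings.
solar = Node("solar", {
    "sol-001": {"description": "Full-disk magnetograms, daily"},
})
magnetospheric = Node("magnetospheric", {
    "mag-007": {"description": "Magnetometer time series, 1-min resolution"},
})

catalog = CentralCatalog()
catalog.register(solar)
catalog.register(magnetospheric)

print(catalog.search("magnet"))
```

The design choice mirrors the panel's reasoning: the nodes (the PIs and modelers best suited to maintain their own data) remain authoritative, while the central layer stays small enough to be run by the modest supervisory office the panel proposes.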
A similar management structure, tasked with similar functions, was proposed at the Space Physics Data System Community Workshop held at Rice University in 1993⁸ and has subsequently been endorsed at community forums and by review teams.

Table 4.2 provides the suggested funding profile for an SPIS. For comparison, the NSF-NASA Virtual Solar Observatory initiative requested $4.3 million to support setup costs for 10 nodes and a core team during the first year, $2.9 million during the second year, $1.7 million each year thereafter, and between $115,000 and $350,000 for initial and subsequent costs for each node added. The ambitious goals of NASA's LWS and the NSF's Space Weather programs will require the implementation of a sophisticated data system like that envisaged in this recommendation, so part of the funding for the SPIS should come from these programs.

TABLE 4.2 Funding Profile for a Space Physics Information System

Schedule | Tasks | Cost (million $)
Year 1 | Establish central and four discipline nodes | 1.5
Year 2 | Fund competed tasks | 3.5
Year 3 | Expand competition and services | 5.5
Year 4 | Continued expansion | 7.5
Year 5+ | System fully operational | 10.0

⁸NASA, Office of Space Science and Applications, Space Physics Division. 1993. SPDS Concept Document on NASA's Space Physics Data System. NASA, Washington, D.C.

Pending the establishment of the SPIS, each agency might initiate or strengthen programs that devote funding to data management and access issues (e.g., NASA's AISRP). However, since the SPIS will support research programs that cross agencies, disciplines, and methodology and will have, in addition, implications for data management in all the sciences, a broad new initiative for funding is clearly required that embraces the entire scope of the recommendation.

Challenge 6: Transition of Scientific Models to Operational Status

Despite the existence of many solar, heliospheric, and geospace models that can potentially be used for operational space weather forecasting, relatively few have so far been transitioned into operation at the NOAA and Air Force space weather centers. This is due to inadequate resources for transition efforts, in particular at the NOAA/Space Environment Center.

Recommendation 4. NOAA and the Air Force should initiate a program to support external research groups in the transitioning of their models to NOAA and Air Force rapid prototyping centers (RPCs). Program support should include funding for documentation, software standardizing, software support, adaptation of codes for operational use, and validation and should allow for the research group to assist in making the scientific codes operational. The RPC budgets of the NOAA Space Environment Center and the Air Force Space and Missiles Center/DSMP Technology Applications Division (SMC/CIT) should be augmented to facilitate the timely transition of models.

Competition within the transition initiative should be open to all potential model providers, and the grantees should be selected by the space weather centers on a peer-reviewed basis. As noted above, precise levels, particularly for the Air Force, are difficult to determine accurately. Costs at NOAA have been estimated based on prior costs to transition models that are currently operational at NOAA, including costs incurred directly by NOAA as well as by private-industry partners. The panel estimates that $1 million will be required to support the design and initial implementation of a software and database infrastructure; hardware acquisition; and initial validation, visualization, and hosting of two new models. Subsequently, $500,000 per year would be required to support at least three permanent NOAA staff and an additional $500,000 for operational support, real-time data links, software standardization, documentation, etc. Since the panel anticipates three to five codes per year being readied for transition at a cost of ~$200,000 each, an additional $1 million per year should be available for external researchers selected via open competition. A conservative annual budget is therefore between $2 million and $2.5 million. The transition initiative should be funded through a partnership between the Air Force and NOAA, and the precise levels of funding should be determined by the needs of each. The NOAA/Space Environment Center and Air Force RPC budgets should be augmented, and more support from industry and business is necessary.
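The panel's cost estimate for the model-transition program can be tallied explicitly. This sketch simply reproduces the figures quoted above; the only interpretation added is that the one-time $1 million setup cost is kept separate from the recurring annual total.

```python
# Tally of the panel's estimated budget for transitioning scientific
# models to operational status at the NOAA and Air Force rapid
# prototyping centers, using the figures quoted in the text.

setup_cost = 1e6             # one-time: software/database infrastructure,
                             # hardware, validation/hosting of two new models
staff_support = 500e3        # per year: at least three permanent NOAA staff
operational_support = 500e3  # per year: real-time data links, software
                             # standardization, documentation, etc.
cost_per_code = 200e3        # approximate external transition cost per model
external_pool = 1e6          # per year, awarded via open competition

# Five codes at ~$200,000 each fit exactly within the $1 million pool.
assert 5 * cost_per_code <= external_pool

annual_budget = staff_support + operational_support + external_pool

# The recurring total is $2 million; the panel's conservative quote of
# $2 million to $2.5 million leaves headroom above these line items.
print(f"recurring annual budget: ${annual_budget / 1e6:.1f} million")
```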
The successful implementation of this recommendation will recognize and strengthen the ability of space physics to contribute to economic, societal, and governmental needs, especially where space weather and space climatology impact human activities and technological systems.