This appendix, drawn largely from the analysis of this decadal survey’s Innovations: Technology, Instruments, and Data Systems Working Group, summarizes information provided to the survey committee regarding areas where innovations would have a substantial impact in solar and space physics. The most significant advances over the next decade and beyond are most likely to derive from new observational techniques; from innovative approaches to access and to dissemination, maintenance, storage, and use of data; and from more capable observational platforms.
INSTRUMENTATION DEVELOPMENT NEEDS AND EMERGING TECHNOLOGIES
Progress toward meeting the scientific and technical challenges for solar and space physics over the coming decade hinges on improved observational capabilities and novel instrumentation. During times of diminished flight rates, the most promising instrumentation concepts must be brought to spaceflight readiness so that new instruments are selectable. Recent solar and space physics missions have typically flown slightly modified versions of existing instrumentation, owing to limited instrument development opportunities and a risk-averse programmatic environment. Alternatively, incremental development of selected sensors has sometimes been carried out during mission implementation, increasing risk, cost, and schedule vulnerability.
A primary reason for this is the scant and fragmented instrument development support in the heliophysics programs Supporting Research and Technology, Living With a Star, and Low Cost Access to Space. Competing for limited resources, instrument development proposals often go unfunded because they cannot promise immediate science closure, even though the developed technology would enable breakthrough measurements in a strategic sense. Moreover, no effective program exists to carry concepts beyond basic development and ensure technical readiness for a mission; such development comes at a higher cost than can currently be supported. Disturbingly, the scarcity of resources for instrument development has left the solar and space physics field with an instrument development cadre substantially smaller and less capable than it needs.
Deliberate investment in new instrument concepts is necessary to acquire the data needed to further solar and space physics science goals, reduce mission risk, and maintain an active and innovative hardware development community. To demonstrate the need for a dedicated NASA-funded instrument and technology development program, this section describes several notional instruments that could carry out science investigations considered for this decadal survey. Several of these cut across disciplines and thus serve a variety of important goals. These are representative examples, and specifics may change over time as new technology becomes available and scientific progress occurs. The dynamic nature of the requirements underscores the need for proactive management. Adequate, strategic resources are needed for a combination of basic development, system integration, and technology readiness level (TRL)-boosting steps, the latter usually being the most costly.
Airglow Imaging in the Visible and Far Ultraviolet (AIMI and SWMI)
Earth’s upper atmosphere (30-1,000 km) plays a key role in the interaction with the magnetosphere. The lower regions (<200 km) are virtually inaccessible to in situ probes. Improved instrumentation that reveals the global interaction and provides altitude-specific information is needed to study energy and plasma coupling effectively. Wind measurements are critical. Composition (ion/neutral) and temperature, as well as the type and energy of precipitating particles and ionospheric conductivities, can be effectively deduced from induced atmospheric emissions, which auroral and airglow imaging in the visible and far ultraviolet can provide. Such an instrument requires mirrors with higher-reflectance coatings than are currently available, as well as narrower-band filters and blazed gratings with high ruling densities. “Solar blindness” in the ultraviolet regime needs to be improved substantially. Basic development of such components has been completed; system integration and in-space testing are needed to boost the system TRL.
Visible and Infrared Lidar Probing of the Upper Atmosphere (AIMI and SWMI)
Optical techniques for remote sensing of the upper atmosphere are limited by the inherent line-of-sight integration. Improving lidar technology holds great promise. Like radar, lidar has three-dimensional resolving capability and is increasingly used to measure density distributions, drift speeds, and temperatures of atmospheric constituents and trace elements. Three-dimensional resolution is especially important in characterizing atmospheric waves propagating from the lower atmosphere.
Ground-based resonance lidars currently probe altitudes of 80 to 110 km, and Rayleigh lidars probe from 30 to 80 km, with power-aperture products of 10 to 20 W m². To probe above 110 km requires a substantial increase in power aperture. High-frequency, high-power lasers are now available, and with a 10 by 10 array of 1-m² telescopes, more than 1,000 times the current capability is possible. While the components exist, system integration development is needed.
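As a rough consistency check on the quoted factor-of-1,000 gain, the power-aperture product of such an array can be sketched as follows; the per-telescope laser power is an assumed, illustrative value, not a number from the survey:

```python
# Illustrative power-aperture scaling for a ground-based lidar array.
# The per-telescope laser power (200 W) is an assumed value chosen to
# show how a 10 x 10 array of 1-m^2 telescopes could exceed current
# 10-20 W m^2 systems by a factor of ~1,000.

current_power_aperture = 20.0   # W m^2, upper end of current systems

laser_power = 200.0             # W, assumed high-power laser per telescope
telescope_area = 1.0            # m^2 per telescope
n_telescopes = 10 * 10          # 10 x 10 array

array_power_aperture = laser_power * telescope_area * n_telescopes
gain = array_power_aperture / current_power_aperture

print(f"Array power-aperture product: {array_power_aperture:.0f} W m^2")
print(f"Gain over current capability: {gain:.0f}x")
```

With these assumptions the array delivers 20,000 W m², a thousandfold gain over the best current ground-based systems.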
A space-based helium lidar could provide Doppler measurements over the entire 250- to 750-km altitude range. From low Earth orbit, a system with a power-aperture product of 10 W m² is sufficient to provide spatial profiles at 10-km altitude resolution. Technology development is necessary to bring such a system to spaceflight readiness, whether on the International Space Station (ISS) or as a free-flyer.
Magnetosphere-to-Ionosphere Field-Line Tracing Technology (SWMI and AIMI)
Determining how magnetosphere-ionosphere-thermosphere coupling controls system-level dynamics requires accurate observational knowledge of the magnetic connection between magnetospheric and ionospheric phenomena. One technique that is ready for a technological boost involves firing a high-energy
electron beam along the magnetic field from a spacecraft in the equatorial magnetosphere and detecting the location of the ionospheric foot point by optically imaging the airglow spot the beam produces.
This straightforward technique faces a number of technical challenges. First, finding and imaging the beam spot from the ground in the presence of auroral activity is difficult. Simply increasing the power of the electron beam leads to a second challenge: preventing space-charge buildup within the beam and the resulting beam divergence. The third challenge is perhaps the most severe: extracting negative charge from an ungrounded spacecraft in the tenuous magnetosphere can lead to catastrophic spacecraft charging. A high-current plasma contactor may provide a solution, but it may degrade electron-beam pointing. Issues of optimizing power-storage systems, accelerators, and plasma contactors for the spacecraft remain.
High-Resolution High-Cadence Infrared-Ultraviolet Imaging of the Solar Atmosphere (SHP)
Tracing the transformation of magnetic energy into kinetic and thermal energy in the solar atmosphere is key to understanding the processes that control heating and the acceleration of flows and particles, as well as large explosive phenomena with substantial impact on our space environment. Present detectors cannot adequately measure these processes, which occur on spatial and temporal scales that push the observation requirements to higher angular resolution (<0.1 arcsec) and faster cadence (<10 s). Imaging at the smallest scales with the highest time resolution requires large-format, high-speed, high-efficiency detectors (advanced charge-coupled devices, complementary metal-oxide semiconductor sensors) and fast polarization modulators.
Design concepts exist, but implementation requires compromises between spatial and temporal capabilities. Basic development must be followed by a staged approach with system integration and TRL boosting. Synergism with ground-based development is possible in the visible spectrum.
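To illustrate why such detectors stress onboard data systems, a back-of-the-envelope data-rate estimate can be made; the 4k × 4k format and 16-bit depth are illustrative assumptions, not requirements stated in the survey:

```python
# Back-of-the-envelope data rate for a large-format, high-cadence
# solar imager. The 4k x 4k format and 16-bit depth are illustrative
# assumptions, not survey requirements; compression is ignored.

pixels = 4096 * 4096        # detector format (assumed)
bits_per_pixel = 16         # digitization depth (assumed)
cadence_s = 10.0            # one full frame every 10 s

bits_per_frame = pixels * bits_per_pixel
rate_mbit_s = bits_per_frame / cadence_s / 1e6

print(f"Frame size: {bits_per_frame / 8 / 1e6:.1f} MB")
print(f"Sustained rate: {rate_mbit_s:.1f} Mbit/s, uncompressed")
```

Even a single such channel at 10-s cadence sustains tens of megabits per second before compression; multi-channel polarimetry multiplies this accordingly.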
High-Angular-Resolution Energetic Neutral Atom Imaging (SWMI and SHP)
Key magnetospheric processes associated with solar wind driving and ionospheric coupling occur on scales from global down to those of individual flux tubes, requiring both effective remote sensing and detailed in situ measurements. Energetic neutral atom (ENA) imaging provides the global view but currently lacks sufficient spatial resolution; resolving these scales would require ≈1° angular resolution and greater sensitivity. The same advances would also improve observations of the heliospheric boundary, for which 2-3° resolution is adequate.
Recent developments promising breakthrough improvements include ultra-thin foils for good statistics at moderately low energies, higher-efficiency electrostatic configurations for multiple-coincidence measurements, sensor designs with intrinsic low sensitivity to visible and ultraviolet light, and improved charged-particle rejection techniques. Important development steps include a system-level imager design for an ultraviolet-blind detector, followed by TRL boosting.
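To put the ≈1° requirement in context, a short calculation gives the transverse scale such an imager would resolve; the 8-R_E viewing distance is an assumed, illustrative vantage, since actual imaging geometry varies along an orbit:

```python
# Transverse spatial scale resolved by a ~1-degree ENA imager.
# The 8 R_E viewing distance is an assumed, illustrative vantage;
# actual imaging geometry varies along a spacecraft orbit.

import math

R_E_km = 6371.0                        # Earth radius
viewing_distance_km = 8 * R_E_km       # assumed distance to emission region
resolution_deg = 1.0                   # target angular resolution

scale_km = viewing_distance_km * math.radians(resolution_deg)
print(f"Resolved scale at 8 R_E: {scale_km:.0f} km "
      f"({scale_km / R_E_km:.2f} R_E)")
```

At this geometry, 1° corresponds to roughly 900 km, approaching the scale of groups of flux tubes rather than the many-thousand-kilometer blur of earlier imagers.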
Solar Flare Neutral Energetic Particle Imager (SHP)
Direct observations of neutral atoms at mega-electron-volt energies from solar flare regions provide remote information about acceleration sites at the Sun, including spatial and temporal variations of the acceleration processes (SHP3). Such particles have been observed serendipitously with the STEREO (Solar Terrestrial Relations Observatory) High-Energy Telescope/Low-Energy Telescope, which was designed to measure solar energetic ions. An optimized detector promises much greater sensitivity. This breakthrough observation opens the door to another complementary flare diagnostic technique. Substantial advances can be expected by optimizing proven techniques and applying them within the natural shield of Earth’s
magnetic field. However, to study flare events in detail requires development of dedicated sensors with intrinsic ion suppression and sufficient angular resolution.
Technologies using electrostatic deflection and collimator-based imaging exist for lower energies. The techniques must be expanded toward higher energies, and new system integration and development are needed. This instrumentation requires basic system development, followed by staged boosting of the system’s TRL.
Data from NASA’s heliophysics missions and many ground-based observatories can currently be obtained online, either directly from individual websites or through central archives such as the Solar Data Analysis Center, the Space Physics Data Facility, or the National Space Science Data Center. These data archives are also accessible through virtual observatories (VxOs), whose goals are to provide one-stop access to validated science data from many observatories, along with the necessary tools for cross-mission analysis and visualization. Access to sophisticated modeling tools is provided by repositories such as the Community Coordinated Modeling Center (CCMC). Such agency-sponsored facilities host physics-based or empirical models developed by the user community and allow users to perform their own simulations.
Significant progress has been made over the past decade in defining the fundamental components of the data environment (virtual observatories, archives, etc.) and in starting to build and integrate them; however, there continues to be a dearth of tools for using and analyzing data. Projected data requirements for new projects are not as demanding as the leap from the Solar and Heliospheric Observatory to the Solar Dynamics Observatory (SDO), and new requirements can probably be met with existing technologies and software. For instance, daily generation of Advanced Technology Solar Telescope data in 2018 is estimated to be ~4 TB, about the same as the current SDO export rate. Some segments of the research community, however, still suffer from the lack of effective data policies enforced by sponsoring agencies.
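The quoted ~4 TB/day figure translates into a sustained network bandwidth as follows (a simple unit conversion, assuming uniform export over the day):

```python
# Sustained bandwidth implied by a ~4 TB/day export rate (the estimate
# quoted for ATST, comparable to SDO), assuming uniform export over a day.

tb_per_day = 4.0
bytes_per_day = tb_per_day * 1e12      # using decimal terabytes
seconds_per_day = 86400

rate_mbit_s = bytes_per_day * 8 / seconds_per_day / 1e6
print(f"Sustained export rate: {rate_mbit_s:.0f} Mbit/s")
```

A sustained rate of roughly 370 Mbit/s is well within the reach of existing research network links, supporting the point that the new requirements do not demand a technological leap.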
Data systems supporting heliophysics research over the past decade have evolved from stand-alone, custom-built “stove-pipes” to distributed, interacting systems that leverage software and technologies developed by the community. Much of this welcome development has come through NASA’s Heliophysics Data Environment (HPDE) enhancement and the National Science Foundation’s (NSF’s) Directorate for Computer and Information Science and Engineering and Office of Cyberinfrastructure. However, many heliophysics data sets and models are hosted at multiple data archives and modeling centers, each with different architectures and formats, and much of the work on data systems infrastructure is funded through individual principal investigator teams. This results in uncoordinated software development, unpredictable support life cycles, and data analysis tools with limited scope. Such activity also draws funds and focus away from scientific research and analysis, since investigators are obliged to provide data sets and analysis tools as deliverables. Unfortunately, many of the existing archives, modeling centers, and VxOs are not intercompatible, despite significant overlap in content or access.
The current lack of coordination among data and modeling centers stems mainly from their different philosophies, emphases, formats, architectures, and purposes. One can obtain similar data sets from various nationally funded data archives as well as from VxOs. The existence of duplicative capabilities, each with significantly different purpose and implementation philosophy, provides greater, more flexible access at the cost of generating confusion about which path to follow to the data. National and international
agencies have not identified a common goal, nor have they adopted a standard approach for funding and implementing data facilities and archives.
Current modeling centers, such as the CCMC, have multiple sponsors and allow researchers to run simulations using community-provided models that cover vastly different domains, such as the solar corona, the solar wind, the radiation environment in the heliosphere and Earth’s radiation belts, and the magnetic and electric field environments of the magnetosphere and ionosphere. Although some space weather modeling groups have developed end-to-end models, often the component modules employ controversial techniques and are based on assumptions with inherent strengths and weaknesses. Only a small fraction of all models can be run interactively, and even fewer can be coupled. This makes it difficult to validate different models and to model interesting space weather events.
Future Goals and Directions
Heliophysics is poised to make a natural transition from a discipline driven predominantly by the pursuit of basic scientific understanding of physical processes to one that must also address more operational, application-specific needs, much like terrestrial weather forecasting. This transition requires (1) instant, unfettered access to a wide array of data sets from distributed sources in a uniform, standardized format; (2) incorporation of the results of community-developed models; and (3) the ability to perform simulations interactively and to couple different models to track ongoing space weather events.
NASA has already taken the important first step in integrating many of these data sets and tools to form the HPDE. The main objective of the HPDE is to implement a distributed, integrated, flexible data environment. HPDE modeling centers should serve as a sound foundation for a future, fully integrated heliophysics data and modeling center.
The key ingredients necessary for any successful centralized data and modeling environment are (1) full involvement of data providers, (2) rapid, open access to scientifically validated data, (3) peer-reviewed data systems driven by community needs and standards, (4) coordinated, user-friendly analysis tools, (5) reliable high-performance computing facilities and data storage, (6) uniform terminology and adequate documentation describing data products and sources, (7) flexible, interoperable, and interconnected data archives, modeling centers, and VxOs, and (8) effective communication among data providers, national and international partners, and data users.
The tremendous quantity of heliophysics data that will become available in the next decade will strain the financial, personnel, hardware, and software resources available to individual scientists, teams, and even national agencies. The dramatic advances in computing and data storage technology over the past decade are likely to continue, so the cost of future data systems and modeling centers will be dominated by personnel and software development rather than by the acquisition of ultrafast computing or data storage. To achieve these goals efficiently, the national agencies will need to develop a common approach for funding data facilities, archives, modeling centers, and VxOs and to coordinate the development of data systems infrastructure, including data systems software, data analysis tools, and training for personnel.
Opportunities in New Data Systems
Community Input to and Control of the Integrated Data Environment
A number of virtual observatory and other data identification and access tools have appeared or are under development. These efforts could be strengthened, better focused, and more efficiently managed if more user feedback were incorporated into their governance, perhaps by formalizing community oversight
of such emerging, integrated data systems in an ad hoc group such as the NASA Heliophysics Data and Computing Working Group. Interagency coordination of the data environment as a whole would benefit researchers whose efforts are funded by multiple agencies.
The information technology industry continues to generate novel technologies and capabilities faster than any federally funded, competitively sourced research program can hope to match. Agencies must be agile enough to exploit emerging technologies without investing in their original development. The best approach is to (1) focus on commercially viable technologies for which there is a demonstrated need, such as high-performance computing clusters, and (2) otherwise invest modestly in the evaluation of emerging commercial technologies through existing mission and small-scale data center activities.
NASA has funded virtual observatories and related “middleware” development. Some of these have led to useful targeted data identification and access technologies, and some are still under development. Mature capabilities should not continue to compete with research proposals for funding. A more effective approach would be for NASA and its agency partners to establish a heliophysics-wide data infrastructure, selecting the most useful efforts for stable funding and bringing other efforts to a close. Future developments can be managed through the supplemental funding mechanisms discussed in the sections “Emerging Technologies” or “Community-Based Software Tools.”
Community-Based Software Tools
In a few subdisciplines, such as solar physics, the availability of integrated open-source data reduction and analysis tools makes a significant difference in the ability of researchers to access and manipulate data. In areas where such tools are not available, immediate agency investment in community-based development would be highly productive. Where tools are already available, support to maintain and evolve them as new data sets and capabilities emerge should continue. Capabilities should expand to include data mining and assimilation in order to enable full exploitation of the large new heliophysics data sets.
The astrophysics and geophysics communities have taken the lead in adopting modern, “semantic” technologies, where machines “understand” the context and meaning of data, to enable cross-discipline data access. Promoting the development of semantic technology would enable the emerging data access capability in heliophysics to share data and knowledge with other fields.
A National Approach to Data Policies
The heliophysics data policies of the funding agencies differ and are in some cases lacking. NSF, for instance, now requires a data management plan in all research proposals, but its geosciences directorate does not yet have a uniform data access and preservation policy. NASA Heliophysics has a well-developed data management policy, but long-term preservation of data remains in a state of flux. It would be wise for the agencies to formulate a national policy for the curation of data from taxpayer-funded scientific research. For heliophysics, the Committee on Space Weather could review and monitor agency data policies.
SPACECRAFT TECHNOLOGIES AND POLICIES
Space technology has matured over the past five decades, enabling reliable access to both near-Earth space and beyond. Nonetheless, continued progress in heliophysics, carried out with robotic spacecraft, requires infusion of new technology to advance the scientific program affordably. Meeting the survey’s science goals and maintaining leadership in heliophysics requires improved spacecraft technologies, as well as appropriate new sensors and data analysis tools.
The most significant advances in heliophysics over the next decade and beyond are most likely to derive from new observational techniques in new locations. Such techniques require a synergistic combination of spacecraft capabilities, sensors, and data-processing capability. Innovation is most likely to occur in an environment that allows ready access to advanced technology in space. Here “access” refers both to the number of launch opportunities and to appropriate risk policies. Available financial resources ultimately limit all NASA robotic scientific missions, and a very significant driver is launch vehicle cost. Thus a major motivation for technology investment is to provide more scientific capability with fewer spacecraft resources. While mass is typically the primary resource, it is also intimately connected with power, propulsion, and data return capability.
Heliophysics will benefit from developments in spacecraft technologies and policies in six broad areas, based on a review of community white papers and panel reports:
1. Constellations of small (<~20 kg) spacecraft
2. Spacecraft propulsion systems
a. Solar sails
b. High-drag environment
3. Communication systems
4. Spacecraft power systems
5. Access to advanced fabrication
6. Policy—International Traffic in Arms Regulations (ITAR), risk management, and radio frequency spectrum allocation.
Items 1 through 4 have been mentioned in surveys and NASA roadmaps for the past decade or longer. Both heliophysics and planetary exploration have interest in items 3 and 4. While some commonality exists in principle for item 2, the implementations differ; for example, aerocapture and aerobraking are more useful for planetary exploration, whereas solar sails, a potentially enabling technology for a variety of heliophysics missions, are less significant for planetary exploration.
The study of the heliophysics system requires multipoint observations to develop understanding of the coupling between disparate regions—solar wind, magnetosphere, ionosphere, thermosphere, and mesosphere—on a planetary scale and to resolve the temporal and spatial ambiguities that limit scientific understanding. Most AIMI and SWMI missions require multiple spacecraft. Approximately 25 community white papers are associated with this topic, and past NASA heliophysics roadmap mission concepts suggest constellations of 20 to 90 spacecraft.
Small satellites in the 1- to 20-kg range enable the possibility of large constellations. The utility of multipoint observations in heliophysics has been demonstrated by NASA’s Time History of Events and Macroscale Interactions during Substorms (THEMIS) mission, a constellation of five 100-kg probes. The Space Technology 5 (ST5) spacecraft, at ~25 kg each, were flown under the New Millennium program before it was terminated.
To enable future missions, it would be wise to accelerate the development of spacecraft technologies for supporting small satellites, including constellation operations and inter-spacecraft coordination. Also useful would be investigating system engineering tradeoffs for designing a large constellation of small, scientific satellites, including balancing the risk of using modern, low-power electronics in space versus spacecraft lifetime.
Heliophysics can benefit from observations of the Sun, Earth, and heliosphere from orbits requiring continuous propulsive activity to maintain or reach in a timely fashion. Vantage points above Earth’s poles, at sub-L1 locations, and at high ecliptic latitudes have unique properties that enable important observations. Six community white papers advocated science enabled by this technology, and previous strategic studies have advocated solar sails. Past heliophysics mission concepts include a solar polar imager, a stationary Earth polar observatory, upstream solar wind monitoring at a sub-L1 location, an L5 mission, Solar Sentinels, and Interstellar Probe.
Significant investments have already been made in the United States and abroad. The Japan Aerospace Exploration Agency (JAXA) and NASA carried out dedicated tests of solar sailing for primary thrust using the IKAROS spacecraft and NanoSail-D2. Without an appropriate demonstration mission that goes beyond the small-scale, heavy-sail tests to date, the possibility of using this technology in the future will remain in doubt. NASA’s Office of the Chief Technologist has suggested a technology demonstration line that could cover 75 percent of the cost of a $200 million solar sail demonstration mission. The committee strongly urges that this potential opportunity be pursued to demonstrate a 25 g/m², ~40-m solar sail.1
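As a rough check on what such a sail could deliver, the ideal sail-only acceleration at 1 AU follows directly from solar radiation pressure; payload and bus mass are ignored here, so this is an upper bound, not a mission performance figure:

```python
# Ideal sail-only acceleration of a 25 g/m^2, 40-m square solar sail
# at 1 AU, assuming perfect reflection and normal solar incidence.
# Payload and bus mass are ignored, so this is an upper bound.

solar_flux = 1361.0             # W/m^2, solar constant at 1 AU
c = 2.998e8                     # m/s, speed of light
pressure = 2 * solar_flux / c   # N/m^2 for a perfectly reflecting sail

areal_density = 0.025           # kg/m^2 (25 g/m^2)
side = 40.0                     # m
area = side * side              # 1,600 m^2

sail_mass = areal_density * area                  # ~40 kg
accel_mm_s2 = pressure * area / sail_mass * 1e3   # mm/s^2

print(f"Sail mass: {sail_mass:.0f} kg")
print(f"Acceleration: {accel_mm_s2:.2f} mm/s^2 (sail only)")
```

An acceleration of a few tenths of a millimeter per second squared, sustained indefinitely, is what makes non-Keplerian vantage points such as sub-L1 stations and solar polar orbits reachable.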
Heliophysics system science requires an understanding of the plasma-neutral coupling, global composition, and structure of Earth’s upper atmosphere (<300 km altitude). Long-term in situ observations in this relatively high-drag region are required to develop scientific understanding of this coupling region between space and Earth’s upper atmosphere. Reaching this altitude requires higher-performance propulsion systems, innovative ways to reduce dependence on expendables, and use of aerodynamic effects to enable satellite operations in high-drag regions. The conventional approach of “dipper” satellites that maneuver in and out of high-drag regions does not significantly increase observational time. Using capable but “disposable” (very small, very-low-cost) satellites to explore high-drag regions is an approach that has yet to be studied in detail. Such observational platforms could be deployed from the ISS or another host spacecraft if a propulsion system were developed to enable the deployment and maintenance of small satellite constellations in and above these high-drag environments.
1 See NASA, In-Space Transportation Capability Portfolio, 2005.
Study of the heliophysics system requires data-intensive observations from distant vantage points or from small, resource-constrained spacecraft. As with planetary missions, optical communications could enable large data rates. It would be prudent to begin developing space and ground-station communications for a swarm of small, low-power, Earth-orbiting satellites and for distant platforms such as an L5 mission, a solar polar orbiter at high ecliptic latitude, or ultimately an interstellar probe.
In situ study of the outer heliosphere requires operations past the orbit of Jupiter. At such large heliocentric distances, solar power is impractical. Other spacecraft power systems are needed. This applies both to heliophysics and planetary exploration missions. Advanced Stirling Radioisotope Generators are a potential solution. There should be a sufficient supply of the radioactive isotope plutonium-238 for use in advanced spacecraft power systems, regardless of the power conversion technology employed.
The heliophysics community needs access to advanced design and fabrication techniques for new sensing elements, new instrument techniques, and the application of greater computing power to enable scientific progress throughout the field. An agency-supported center could provide valuable assistance to spacecraft teams, instrument designers, and computing groups, serving as a consultant, provider of services, or broker for government or industrial technologies useful in aerospace applications. Rapid and cost-effective creation of custom hardware for the implementation of computational algorithms is needed for advanced sensor systems and for advanced heliophysics modeling. Custom hardware for numerical simulations can exceed by orders of magnitude the speed of general computer implementations.
The experimental community must be able to design and fabricate custom analog, digital, mixed-signal, and microelectromechanical systems (MEMS) devices rapidly and cost-effectively. Even complex current technologies such as field-programmable gate arrays (FPGAs) continue to drive costs and delivery schedules. Broad use of these techniques requires access to both design and fabrication methodologies at reasonable cost. Access to advanced fabrication has the potential to revolutionize heliophysics sensor and spacecraft systems.
Policy Issues—ITAR, Risk Management, Frequency Spectrum
International Traffic in Arms Regulations
The United States seeks to protect its security and foreign-policy interests, in part, by actively controlling the export of goods, technologies, and services that are or may be useful for military development in other nations. “Export” is defined not simply as the sending abroad of hardware but also as the communication of related technology and know-how to foreigners in the United States and overseas.2
The International Traffic in Arms Regulations (ITAR), which controls defense trade, includes the U.S. Munitions List (USML), which specifies categories of defense articles and services covered by the regulations. In 1999, space satellites were added to the USML. However, in 2002, ITAR was amended to exempt U.S. universities from having to obtain ITAR licenses when performing fundamental research involving foreign countries and/or persons. Because universities often collaborate with foreign partners in research and teach or employ foreign graduate students and other researchers, ITAR has a substantial effect on university activities in the space sector. Many university activities are considered to be fundamental research and thus are excluded from ITAR control; however, academic regulatory-compliance administrators and researchers alike still encounter problems with space-related activities because of the narrow and somewhat ambiguous conditions that enable research to be considered “fundamental” and therefore excluded from licensing under ITAR.3
2 National Research Council (NRC), Space Science and the International Traffic in Arms Regulations: Summary of a Workshop, The National Academies Press, Washington, D.C., 2008.
The use of technology developed for commercial purposes in spacecraft systems and science instrumentation has enormous potential for advancing science. However, space-qualified electronics are regulated under ITAR, so commercial developers avoid ITAR restrictions by avoiding dual-use recognition. ITAR “deemed export” rules limit the exchange of technical information on instruments and spacecraft technologies, thus impeding the development of scientific tools. This uncertainty stifles collaboration between domestic industry and research universities and discourages opportunities for progress involving U.S. and foreign scientists and students. Obstacles to partnership have driven Europe, Japan, and other nations to push ahead with their own small-size missions that foster the training of non-U.S. engineers and scientists.
Programmatic Risk Management
Policies for assessing, managing, mitigating, or even futilely attempting to eliminate risk should be regularly updated so as not to impede technological development and scientific advancement. Continual evaluation of risk policies is needed, including those in NASA Procedural Requirement (NPR) 7120.5x, and they must be applied appropriately to large, high-profile versus small, low-cost programs. Major drivers of program risk, such as uncertain funding profiles or delays in launch services, are often external to a project.
Frequency Spectrum Allocation

Frequency allocation policies for satellites and the congestion of current bands available for space research favor high-profile missions. This can disadvantage the typically smaller heliophysics science missions and those that make use of secondary launch opportunities. Current frequency licensing and allocation policies also require knowledge of orbital parameters long before launch; this limits the opportunistic pairing of small satellites in containerized deployment systems with launch opportunities in which orbital parameters are not known in advance. It would be helpful if NASA engaged proactively with the International Telecommunication Union.
3 NRC, Space Science and the International Traffic in Arms Regulations, 2008.