Report of the Panel on Electromagnetic Observations from Space
NASA’s support of astrophysics research is an essential element in the world-class accomplishments of U.S. astronomers in their exploration of the cosmos. In addition, as was aptly expressed in the first finding of the congressionally requested National Research Council (NRC) study An Enabling Foundation for NASA’s Earth and Space Science Missions, “The mission-enabling activities in SMD [NASA’s Science Mission Directorate]—including support for scientific research and research infrastructure, advanced technology development, and scientific and technical workforce development—are fundamentally important to NASA and to the nation.”1
The Astro2010 Program Prioritization Panel on Electromagnetic Observations from Space (the EOS Panel) reviewed current astrophysics activities supported primarily by NASA’s Science Mission Directorate—specifically, those activities requiring electromagnetic observations from space as distinct from observations of particles or gravitational waves. The charge of the panel was to study possible future activities and to recommend to the Astro2010 Survey Committee a scientifically compelling, balanced, affordable, and relatively low-risk program for the 2010-2020 decade.
Guided by the science opportunities identified in the reports of the decadal survey’s five Science Frontiers Panels (Chapters 1 through 5 in this volume) and within the framework of current and in-process facilities and programs available to the astrophysics community, the panel formulated the program described below for electromagnetic space missions for the 2010-2020 decade. In the process of formulating this program, the panel reviewed nearly 100 written submissions from the astronomy and astrophysics community describing a broad range of potential facilities, required tools, and needed technology development, as well as thought-provoking manifestos on process and principles.
The program recommended by the panel reflects its judgment that, in the 2010-2020 decade—with many scientifically compelling space missions to choose from but with a tightly constrained budget—the highest priority is for programs that will have a major impact on many of the most important scientific questions, engaging a broad segment of the research community.
The panel’s recommended program is divided into large activities and moderate/small activities. The panel expresses emphatic support for a balanced program that includes both. The three large initiatives—the Wide-Field Infrared Survey Telescope (WFIRST) Observatory mission, the International X-ray Observatory (IXO) mission, and an exoplanet mission—are presented in prioritized order.
The four moderate/small activities are not prioritized. The panel’s recommended program calls for strong support of all four, although one—the Space Infrared Telescope for Cosmology and Astrophysics and the Background-Limited Infrared-Submillimeter Spectrograph (SPICA/BLISS)—has de facto priority because of its time-critical nature. The moderate/small initiatives are the SPICA/BLISS initiative, augmentation of NASA’s Explorer program for astrophysics, technology development for a Hubble Space Telescope (HST) successor, and augmentation of NASA research and analysis (R&A) programs in technology development and suborbital science. The relative levels of support for these activities would depend on factors that cannot be forecast in detail, such as (1) the future funding level of NASA’s Astrophysics Division base budget and (2) science opportunities and cost-benefit trade-offs. In the final section of this report (“Funding a Balanced Program”), the panel recommends funding levels across the program that address these issues for three different budget projections for the Astrophysics Division.
Wide-Field Infrared Survey Telescope
The WFIRST Observatory is a 1.5-m telescope for near-infrared (IR) imaging and low-resolution spectroscopy. The panel adopted the spacecraft hardware of
the Joint Dark Energy Mission (JDEM)/Omega mission as proposed to NASA and the Department of Energy (DOE) and substantially broadened the program for this facility. In addition to two dedicated core programs—cosmic acceleration and microlensing planet finding—WFIRST would make large-area surveys of distant galaxies and the Milky Way galaxy, study stellar populations in nearby galaxies, and offer a guest observer program advancing a broad range of astrophysical research topics.
International X-ray Observatory
The IXO mission, a proposed collaboration of NASA, the European Space Agency (ESA), and the Japan Aerospace Exploration Agency (JAXA), will revolutionize X-ray astronomy with its large-aperture, energy-resolving imager. IXO will explore the role of feedback in galaxy evolution by connecting energetic processes within galaxies with the physical state and chemical composition of hot gas around and between galaxies and within galaxy clusters and groups. Time-resolved, high-resolution spectroscopy with IXO will probe the physics of neutron stars and black holes. IXO will measure the evolution of large-scale structure with a dynamic range and detail never before possible.
An Exoplanet Mission
One of the fastest-growing fields in astrophysics is the study of planets beyond our solar system. NASA’s ongoing Kepler mission and this panel’s recommended WFIRST mission will advance knowledge of the demographics of other planetary systems, but further steps will have to be taken to investigate the properties of individual planets around nearby stars. A micro-arcsecond astrometry mission such as the Space Interferometry Mission (SIM) Lite could detect nearby systems of planets and measure their masses. SIM Lite could even detect Earth-like planets, which are particularly difficult to find, in systems near enough to allow detailed study by more ambitious future spectroscopic missions. Alternatively, rapid advances in starlight-suppression techniques could enable a moderate-size facility that could image and characterize giant planets (and perhaps some smaller ones) and investigate the debris and dust disks that are stages in the planet-forming process. Transit photometry and spectroscopy offer yet another powerful, rapidly improving technique for discovering even smaller planets and studying their atmospheres. The panel urges increased technology development for these techniques and recommends that one of these missions, or a yet-to-be-developed approach, be competitively selected around mid-decade and, if the budget permits, started before the end of the decade.
Background-Limited Infrared-Submillimeter Spectrograph—U.S. Collaboration on the JAXA-ESA SPICA Mission
The tremendous success of the Spitzer Space Telescope has spurred the development of a yet-more-powerful far-IR mission, the Japanese-led Space Infrared Telescope for Cosmology and Astrophysics. The U.S. community should join this project by making the crucial contribution of a high-sensitivity spectrograph covering far-IR to submillimeter wavelengths, capitalizing on U.S. expertise and experience in detectors and instruments of this kind. Joining SPICA is time-critical and needs to be a priority. Such participation would provide cost-effective access to this advanced facility for the U.S. research community. Because JAXA and ESA are currently moving ahead with SPICA, the panel recommends that NASA commit to participation and begin to fund this activity now.
Augmenting the Explorer Program for Astrophysics
NASA’s Explorer program is arguably the best value in space astrophysics. After years of reduced funding, increased support for astrophysics Explorers is essential to a balanced program of research and development (R&D). The panel recommends a substantial augmentation of funding dedicated to astrophysics Explorers, with the goal of returning to a flight rate of one Explorer per year by the end of the decade.
Technology Development for a Hubble Successor
The imperative of understanding the history of the “missing baryons,” as well as the evolution of stars and galaxies, requires ultraviolet (UV) spectroscopic observations that are more sensitive, and at shorter wavelengths, than are possible with the new Cosmic Origins Spectrograph (COS) on the HST. Key advances could be made with a telescope no larger than Hubble but equipped with high-efficiency UV and optical cameras having greater areal coverage than Hubble’s. These would support a very broad range of studies. Achieving these same capabilities with a 4-m or larger aperture, in combination with an exoplanet mission capable of finding and characterizing Earth-like worlds, is a compelling vision that requires further technology development. The panel recommends a dedicated program of major investments in several essential technologies to prepare for what could be the top priority in astrophysics for the 2021-2030 decade.
Augmenting Research and Analysis Programs in Technology Development and Suborbital Science
The NASA Research and Analysis (R&A) programs support diverse activities that are crucial to the astrophysics program, including research grants in observation and theory as well as support for laboratory astrophysics, technology development, and the Suborbital program. The panel, recognizing that these are core activities that underlie the NASA astrophysics program, recommends as urgent the augmentation of R&A funding that targets technology development and the Suborbital program. The panel calls for (1) a new initiative of focused technology development for projects that are likely to move ahead in the 2021-2030 decade, (2) a more aggressive program of technology development for missions in their conceptual phase, and (3) greater support for the most promising, possibly transformational, ideas that are not necessarily tied to a particular mission. The panel also recommends an augmentation of the Suborbital program, which also plays a critical role in developing and testing new technologies while providing a nearly space-like environment for low-cost science and—crucially—the training of new instrumentalists.
CONTEXT FOR ELECTROMAGNETIC OBSERVATIONS FROM SPACE IN THE 2010-2020 DECADE
An Impressive Suite of Current Missions
The remarkable advances in astronomy, astrophysics, cosmology, and fundamental physics during the past few decades have been achieved to a considerable extent through a broad array of space facilities that cover much of the electromagnetic spectrum. Scientists currently have access to 15 operating space missions (Figure 6.1). Most are NASA-led missions, but some—for example, Planck, Herschel, and Suzaku—are international collaborations in which NASA is a partner. The crucial roles that these facilities play in advancing the field are discussed in Part I of this volume, the reports of the Science Frontiers Panels of the Astro2010 survey. Although a review of that material is beyond the scope of this panel’s report, the scientific priorities identified in the SFP reports have guided the thinking and the recommendations of the EOS Panel.
This broad and powerful suite of capabilities will not continue: most of these missions will reach the ends of their planned lifetimes early in this decade. An important issue early in the decade will be deciding which of these missions to extend, a choice made through the NASA Astrophysics Division’s senior review process. The astrophysics community can and should play a role in weighing the scientific productivity of each mission against the potential of the initiatives recommended in this panel’s report.
New and Developing Missions for the Next Decade
The Next Generation of Broadly Capable Observatories
NASA’s four Great Observatories—Compton, Hubble, Chandra, and Spitzer—are becoming legends in the history of science. Covering the majority of the electromagnetic spectrum—from gamma rays to the far-IR—these space telescopes have greatly increased our ability to learn what happens in the universe as well as how and why it happens. Not only by providing windows to light that does not reach the ground, but also by offering exquisite spatial resolution and a dark sky, these facilities have been our guides for astronomy’s greatest adventures.
A second generation of highly capable, broad-purpose observatories is emerging. They are Fermi, the James Webb Space Telescope (JWST), a proposed new X-ray Observatory, and future large-aperture, UV-optical and far-IR telescopes. Realizing them all will take decades, but the first—the Fermi Gamma-ray Space Telescope (formerly GLAST)—is already in orbit and has been surveying the full sky every 3 hours since August 2008. Fermi’s Large Area Telescope (LAT) covers 20 MeV to ~300 GeV, and its gamma-ray-burst monitor is sensitive over the range 8 keV to 40 MeV. Fermi has already detected active galaxies, pulsars, supernova remnants, compact binaries, globular clusters, and gamma-ray bursts (GRBs). New classes of gamma-ray sources have also been discovered: starburst galaxies, high-mass X-ray binaries, and new varieties of gamma-ray-emitting pulsars. Fermi has also made important measurements of the galactic diffuse radiation and precise measurements of the high-energy spectrum of cosmic-ray electrons and positrons. Fermi is a highly successful example of interagency cooperation—NASA and DOE—that includes international partners as well, a matter of relevance to the panel’s first-priority recommendation.
The next such observatory will be JWST, the top priority of the previous decadal survey, Astronomy and Astrophysics in the New Millennium (AANM).2 Without a doubt, JWST will be the big space mission of the next decade, independent of the program advanced by the Astro2010 survey. JWST is a broadly capable observatory in the mold of Hubble, a collaborative project led by NASA, with major participation by the European Space Agency (ESA) and the Canadian Space Agency. JWST will cover a broad range of visible-to-mid-IR wavelengths, 0.6 μm to 28 μm, and will provide orders-of-magnitude greater sensitivity than previous IR space
telescopes, with 10 times better angular resolution—comparable to Hubble’s resolution in visible light. Its large 6.5-m segmented primary mirror will view deep space from the Earth-Sun L2 Lagrange point—a million miles from Earth—where the telescope will cool to a frigid 50 K, the key to high sensitivity at long wavelengths.
JWST’s prime mission is to open the last frontier in galaxy evolution—the earliest generations of stars and the birth of galaxies in the first billion years of cosmic history—and to follow the growth and maturation of galaxies through to the modern era. It will also be a powerful platform for studying how stars and their systems of planets are born in the Milky Way galaxy, taking crucial steps to probe the origin of life itself.
The U.S./German Stratospheric Observatory for Infrared Astronomy (SOFIA), a 2.5-m telescope peering into space from the side door of a Boeing 747 aircraft, began science operations in 2010. SOFIA’s near- to far-IR high-resolution spectroscopic observations are likely to have the greatest scientific impact, but its imaging instruments have unique capabilities as well. SOFIA offers 20 years of operations above the absorbing effects of Earth’s atmosphere, flexible operations from different locations over the globe to access both sky hemispheres, and the capability of updating and replacing instruments as technologies improve.
Not all of this panel’s recommended missions fit into the “broadly capable observatory” category—for example, WFIRST (which will devote most of its lifetime to two dedicated programs and surveys with broad application) and an evolving exoplanet mission do not—but one that certainly does is IXO, a revolutionary X-ray telescope proposed as a collaboration of NASA, ESA, and JAXA. The panel also recommends U.S. participation in the Japanese-led SPICA mission, a large, cold telescope for sensitive far-IR observations of primeval galaxies and of disks where planets form. As a successor to the Spitzer Space Telescope, SPICA is another example of a space observatory that will have substantial impact on a wide range of astrophysics questions. The panel also recommends a dedicated technology-development program that could lead to a large UV-optical telescope—a successor to Hubble—in the 2021-2030 decade. The panel views the 2010-2020 decade as a time of great opportunity to capitalize on the Great Observatories program and leverage its success to this next generation.
Smaller Missions in Development for the Next Decade
Much astrophysical research requires ambitious, technologically sophisticated, and complex missions like Hubble and JWST. However, progress depends equally on the ingenuity of scientists and the success of parts of the program that emphasize specific objectives, specialized capabilities, and more flexible and rapidly evolving science programs.
The jewel of this approach is the Explorer program. Two decades of astrophysics Explorers have compiled a stunning record of achievement. In this tradition, the Wide-field Infrared Survey Explorer (WISE), launched successfully in December 2009, has already begun an all-sky survey from 3 to 25 μm that will be hundreds of times more sensitive than that of the Infrared Astronomical Satellite (IRAS) and nearly two orders of magnitude deeper than the JAXA mission Akari. The WISE survey will help search for the origins of planets, stars, and galaxies and create an infrared atlas whose legacy will endure for decades. The Nuclear Spectroscopic Telescope Array (NuSTAR), now in development, will be the first mission to focus high-energy X-rays, pioneering sensitive studies of the “hard” X-ray sky. Scheduled for launch in 2011, NuSTAR will search for black holes, map supernova explosions, and study the most extreme active galactic nuclei (AGN). NASA recently selected for a formulation-phase study the Gravity and Extreme Magnetism Small Explorer (GEMS), a mission that could measure polarization of cosmic X-ray sources and provide unique evidence of a black hole and data on its spin.
The United States is also a major participant in the Japanese Astro-H mission, a moderate-aperture X-ray telescope, the novel feature of which will be its relatively high sensitivity to moderately hard X-rays, E >10 keV. Among its important instruments will be an imaging array of microcalorimeter spectrometers, built by a U.S. team. Astro-H is expected to give a preview of the revolutionary capabilities of the more ambitious IXO mission.
Table 6.1 links this discussion of new and developing missions to recommendations of the two preceding National Research Council decadal surveys: Astronomy and Astrophysics in the New Millennium (AANM; Taylor-McKee)3 and The Decade of Discovery in Astronomy and Astrophysics (Bahcall).4
SCIENCE DRIVERS FOR KEY NEW FACILITIES
The Science Frontiers Panels (SFPs) of the Astro2010 survey have posed key questions that will require new observational capabilities. In Table 6.2 the panel lists these questions and correlates them with the program of activities recommended by the EOS Panel.
The Cosmology and Fundamental Physics (CFP) and the Planetary Systems and Star Formation (PSF) SFPs shine their spotlights on arguably the two most exciting areas in astrophysics: the extraordinary acceleration of our expanding universe, and exoplanets—the search for and study of the rapidly growing population of planets known to orbit stars other than the Sun.
TABLE 6.1 Status of Previously Recommended Space Astrophysics Programs

Recommended Program: Current Status and EOS Panel’s Recommendation

2001 AANMa Recommended

Next Generation Space Telescope (NGST): Renamed the James Webb Space Telescope (JWST); in development for a scheduled launch in 2014.

Constellation-X Observatory (Con-X): Reconfigured as the International X-ray Observatory (IXO). EOS Panel recommends for this decade (pending ESA selection).

Terrestrial Planet Finder (TPF): Several concepts studied and considerable technology development, but not ready for this decade. EOS Panel recommends further technology development.

Single Aperture Far-Infrared Telescope (SAFIR): Not ready. EOS Panel recommends further technology development and contribution of a far-infrared/submillimeter spectrometer to the JAXA-led SPICA mission for this decade.

Gamma-ray Large Area Space Telescope (GLAST): Developed jointly by NASA, DOE, and foreign partners. Renamed Fermi. Began operations in August 2008.

Laser Interferometer Space Antenna (LISA): Not reviewed by EOS Panel; reviewed by the Astro2010 Panel on Particle Astrophysics and Gravitation.

Energetic X-ray Imaging Survey Telescope (EXIST): Expanded mission proposed to Astro2010 survey in “Major Category.” EOS Panel judged the science of insufficient priority to justify the high cost and schedule risk as determined by the Astro2010 independent cost appraisal and technical evaluation process. Not recommended by EOS Panel.

Research and analysis (R&A) programs: R&A budgets cut over past decade. EOS Panel recommends budget augmentation for R&A programs, including technology development, theory, laboratory astrophysics, and the Suborbital program.

Ultralong-Duration Balloon program and Sounding Rocket to Orbit program: EOS Panel acknowledges great scientific potential of these longer-duration programs beyond the present Suborbital program and suggests possible payload support from Missions of Opportunity or Explorer lines.

1991 Bahcall Surveyb Recommended

Stratospheric Observatory for Infrared Astronomy (SOFIA): Under development by NASA and the German Aerospace Center (DLR). Telescope installed and first door-fully-open flight successful. First science in 2010.

Space Interferometry Mission (SIM): Re-endorsed by AANM. Reconfigured and proposed as the SIM Lite Astrometric Observatory to Astro2010. EOS Panel recommends as a candidate for the exoplanet mission, with a possible start late in the decade.
aNational Research Council, Astronomy and Astrophysics in the New Millennium, National Academy Press, Washington, D.C., 2001.
bNational Research Council, The Decade of Discovery in Astronomy and Astrophysics, National Academy Press, Washington, D.C., 1991.
NOTE (Table 6.2): Darker color indicates a strong impact of the facility on answering the question. “Exoplanet” entries correlate the PSF questions with several proposed missions, as described in the panel report text. The maroon squares under “UV-optical telescope” refer to a possible planet-finding and characterization capability.
The EOS Panel assigned highest priority to a major new facility that will address both of these exciting fields along with many other SFP questions. The Wide-Field Infrared Survey Telescope, WFIRST, is a modest-aperture, near-IR telescope—the JDEM/Omega design—that will make precise measurements of the universe’s evolving geometry and structure over cosmic time. Using three powerful methods, WFIRST will lead the effort to unravel a great astrophysics mystery of our time—why is the universe not just expanding, but expanding at an accelerating rate? Is this cosmic acceleration due to a strange, previously unknown “dark energy” that is defeating the pull of gravity on the vast scale of the universe, or is it possible that we have discovered that Einstein’s formulation of the law of gravity is not quite right? WFIRST’s cosmology experiments will help us find out.
With the same instrumentation, WFIRST will search for planets around distant stars—from giant planets down to those smaller than Earth—using the technique of microlensing, a consequence of Einstein’s general theory of relativity. Combining the results of NASA’s ongoing Kepler mission with the WFIRST planet search will produce the first comprehensive survey of other worlds and begin to answer the fundamental questions posed by the PSF Panel: How diverse are planetary systems and how often are they like our own solar system? How common are small, rocky worlds like Earth, and how often are they found in orbits that lead to temperate conditions favorable for life?
WFIRST will also carry out near-IR surveys of extensive areas of the sky, including the plane of the Milky Way galaxy, and support guest observer programs of pointed observations, for example studying the stellar populations of individual nearby galaxies. Such programs will be crucial for progress in the study of galaxy evolution and the development of large-scale structure in the universe, and for understanding how our galaxy and its neighbors were assembled and grew. WFIRST will do both focused and broad scientific programs at a cost substantially less than several focused programs would require.
Another critical subject that is almost untouched, for lack of resources and need of further technological advances, is the exploration of the vast store of “normal” atomic matter—to astrophysicists, baryonic matter. Galaxies, stars, planets, and people make up some fraction of this, but most baryonic matter is between the galaxies, its location and condition largely unknown. The connection between “missing baryons” and the known baryons threads through the reports of the Galaxies Across Cosmic Time (GCT) SFP and the Galactic Neighborhood (GAN) SFP and is a major component in the key science questions they pose.
Astronomy was born and has matured in the glow of visible light, but it is now known that most baryons exist as gas that emits essentially no visible light at all. In fact, from cosmological measurements and the study of the abundances of chemical elements, it is known that these “invisible” baryons outnumber the visible ones by 10 to 1. But little has been learned about them; progress so far has shown only where, and at what temperature, the missing baryons are not. They are not cold gas, which would be “seen” in radio or millimeter waves. They must be hot—most, very hot. Why is it important to know about this plasma (ionized gas) sea made of electrons, protons, and helium nuclei, loitering around the outskirts of galaxies and between them? Because this vast reservoir of baryonic matter is a key to learning how and why the universe formed stars and galaxies. We want to understand how what was once only hot gas grew into the complexity of matter seen today—up to and including life itself.
We are well along in, and well equipped for, studying the baryons in stars and galaxies: JWST, Spitzer, Hubble, Herschel, SOFIA, and WISE focus on the cooler, lumpier parts of the universe—such as planets and forming and dying stars—while Chandra, XMM, and Fermi observe hotter baryons in a menagerie of exotic forms—neutron stars, black holes, quasars, pulsars, supernovae, starbursts, novae, and so on. But new facilities are essential for understanding the role played by the invisible baryons and how they continue to interact and influence what happens in the universe. The panel agrees with the conclusion of the GCT and GAN SFPs that high-resolution UV and X-ray spectroscopy are the only efficient tools for observing these baryons and studying their dynamic flows into and out of galaxies, a process that profoundly affects galaxy evolution.
Both are essential: X-rays probe the hottest gas between galaxies and within galaxy clusters, while UV allows much higher spectral and spatial resolution for measuring the physical state of the gas, chemical element abundances, and micro-physics. The hottest plasmas can be observed only in X-rays, whereas warm gas—a transition between the cold gas in galaxies and the hot plasma between them—requires the much higher resolution afforded by UV spectrographs. A complete picture of galaxy “ecosystems,” for which there are many theoretical studies but few observational constraints, requires a combination of both techniques.
The panel thus recommends as its second priority IXO, the International X-ray Observatory, a mission that will make giant strides in studies of the hot baryonic
component of the universe as well as provide unprecedented capabilities for studying high-energy phenomena such as black holes and supernovae. IXO will collect much more light than its predecessors and carry a revolutionary microcalorimeter—a unique and powerful instrument that produces X-ray images and high-resolution spectra simultaneously. This information is exactly what is needed to explore the interplay of the baryonic matter in stars with the hot baryonic matter beyond—for example, the supernova explosions and energy eruptions from accreting massive black holes (active galactic nuclei, AGN) that drive gas flows to the outskirts of galaxies. IXO is a broad-purpose astrophysical observatory that will advance this subject and many others.
The Hubble, with its new COS, will make the most sensitive UV search to date for hot gas around and between the galaxies, but a new-technology UV-optical telescope will be needed to go beyond what HST can do. The panel recommends an aggressive program of technology development to develop such a facility. It is quite possible that this UV-optical capability can be combined with a major exoplanet mission: a telescope that can detect directly small planets like Earth orbiting nearby stars, study what they are like, and possibly even find evidence of life on them. The EOS Panel’s recommendation includes technology development for this possibility—in particular, to show whether the two capabilities can be successfully combined.
The search for and study of exoplanets is one of the most exciting and fastest growing fields in astrophysics. Hundreds of planets have been detected by their gravitational “tugs” on parent stars, and progress has been made in the difficult job of describing what some of these planets are like, another priority for the PSF Panel. Kepler and WFIRST will compile basic data for thousands of exoplanets for a statistical study of the diversity of planetary systems, but most will be too distant for detailed study—the next step in this exciting field. The PSF Panel gives special priority as a “discovery area” to the next step of characterizing planets around nearby stars, with the goal of finding planets like Earth.
The panel strongly endorses this goal and recommends as its third priority an exoplanet mission to learn more about the planets of nearby stars. There are already several alternatives for this mission. SIM Lite, a micro-arcsecond astrometry mission that could detect the “wobble” of nearby stars induced by their planets, could discover nearby worlds as small as Earth and offer a unique program of general astrophysics based on precision distance/motion measurements. Earth-like worlds found with SIM Lite would be targets for more ambitious future missions to study their atmospheres and search for evidence of life. At a somewhat smaller scale, a moderate-sized telescope with excellent starlight suppression could directly detect and take spectra of giant planets, and perhaps even some supersized Earth-like planets. Probing the dusty disks around stars that are in the process of forming planets, and measuring the amount of dust left over in mature systems—data
needed to build a mission capable of detecting a true “Earth-twin”—are principal tasks for such a facility. Another rapidly advancing approach is to search for planets that transit nearby stars and, once they are found, to obtain precision spectroscopy during transits in order to study a planet’s atmosphere through the absorption of starlight. Because an exoplanet mission could not begin until late in the decade, the panel recommends a competition between these and other possible approaches that are likely to develop over the next 5 to 10 years.
The Spitzer Space Telescope made remarkable progress in studies of galaxy evolution and star formation with its astonishing sensitivity in the far infrared. Spitzer has shown that dust-laden galaxies glowing brightly in the infrared, powered by starbursts and accreting black holes, become increasingly important elements in galaxy evolution as we probe further back in time toward the big bang. Spitzer also made significant breakthroughs in studying the cool, dusty environments where stars are born, as well as the disks around young stars, which are planet nurseries. To build on this critical work, the panel recommends—as its highest priority for a modest-scale investment—U.S. participation in SPICA, the JAXA-led successor to Spitzer, which will make far-IR imaging and spectroscopic observations with even greater sensitivity and spatial resolution. Exploiting U.S. leadership in detector technologies to build a background-limited, far-IR/submillimeter spectrometer for SPICA will secure a cost-effective U.S. share of this world-class facility for U.S. astronomers.
The study of the cosmic microwave background (CMB)—the light left from the big bang—has revolutionized the field of cosmology. The past decade’s combination of ground-based, balloon-borne, and satellite facilities has pushed CMB temperature measurements to what had once been unthinkable precision. These data have provided the basis for highly accurate methods of measuring the size, age, mass-energy content, and expansion history of the universe and for detecting the earliest signs of structures that led to stars and galaxies. The state of the art has been achieved with the Wilkinson Microwave Anisotropy Probe, WMAP, a stunning example of how comparatively inexpensive Explorer missions can make giant steps in forefront fields of astrophysics. The Explorer program also builds critical relationships among university groups, national laboratories, and aerospace companies, where the next generation of instrumentalists, engineers, and managers are trained. To ensure that NASA’s Explorer program continues its remarkable record of highly cost-effective, cutting-edge science, the panel proposes a substantial funding augmentation for astrophysics missions of the Explorer program.
NASA’s collaboration with ESA on the newly launched Planck mission will allow U.S. astrophysicists to take the next step in CMB research. Planck will map with even greater precision than WMAP the CMB anisotropies and possibly CMB polarization, a powerful diagnostic of early-universe physics. While the U.S. community prepares to analyze Planck data, it continues to develop better technology
for an even more powerful mission aimed at polarization measurements. Balloon-borne payloads—part of the NASA Suborbital program—are a key component in the invention of new detectors and methods, allowing low-cost development of components for future satellite missions. The EOS Panel recommends an augmentation of the Suborbital program, together with technology development to support CMB and other research programs that depend on invention, innovation, and experimentation. These activities deliver cutting-edge science while providing irreplaceable hands-on experience with technologies that are often destined for large space missions, training the next generation of instrument builders in the process. They are essential to the health of the NASA astrophysics program.
WIDE-FIELD INFRARED SURVEY TELESCOPE—WFIRST
Remarkable opportunities for new space initiatives are at hand. However, resources in the decade 2010-2020 could be substantially less than in the previous two decades. Although excellent science is always the first priority, the panel believes that new space initiatives should emphasize activities that will have a major impact on a wide range of high-priority research questions, engaging a broad segment of the community. It was accordingly with considerable excitement that the panel recognized in the input received from the U.S. astronomical community several strong scientific programs describing essentially the same facility—a 1.5-m space telescope with a focal plane covered with near-IR array detectors. Because of the absorption and emission of Earth’s atmosphere in the near-infrared, a modest-size space telescope can have great impact on many subjects and complement and leverage observations from ground-based facilities. Furthermore, the ~3 times better spatial resolution, compared to the best ground-based (non-adaptive optics-corrected) seeing, offers tremendous gains in sensitivity that are clearly “enabling” for many important programs.
Accordingly, the EOS Panel recommends as its highest priority a facility it calls WFIRST—the Wide-Field Infrared Survey Telescope. The panel chose one particularly well-studied concept—Joint Dark Energy Mission/Omega (JDEM/Omega)—as a “hardware template” for the WFIRST mission. WFIRST will first and foremost tackle two of the biggest questions in astrophysics: Why is the universe accelerating? (SFP science question CFP 2) and, Do habitable worlds exist around other stars, and can we identify the telltale signs of life on an exoplanet? (SFP science question PSF 4). The first 5 years of a 10-year mission would be dedicated largely to answering these questions. The cosmic acceleration research program will make unique measurements of the dark-energy equation-of-state parameter “w” and its time evolution, w_a, while also addressing whether cosmic acceleration might be due instead to an imperfect understanding of gravity. In addition, an unprecedentedly powerful extrasolar planet search will be carried out by monitoring a
large sample of stars in the central bulge of the Milky Way galaxy for microlensing events, producing a demographic survey that will sensitively sample a wide range of planet masses—down to planets smaller than Earth—and of distances from the parent stars.
A significant fraction of the first 5 years will also be used for surveys and smaller peer-reviewed guest-observer projects that will investigate, for example, galaxy evolution, stellar populations of nearby galaxies, and the plane of the Milky Way galaxy. The combination of depth, area, and quality of WFIRST data in the infrared will easily surpass that of any other ground-based or space-based facility. WFIRST research bears substantially on 10 of the 20 key questions posed in the Astro2010 Science Frontiers Panel reports (see Table 6.2). The WFIRST mission will contribute to answering some of these questions through extensive surveys. A deep infrared galaxy survey will cover the key epoch of galaxy assembly, 1 < z < 2, with multiband imaging and low-resolution (grism) spectroscopy (redshifts and line strengths) via Hα (hydrogen emission line) observations. A complete survey of the galactic plane in the near-infrared will also be undertaken. As the mission unfolds, especially as the cosmic acceleration and planet-finding programs complete their first campaigns, these surveys—made immediately available to the entire community—and pointed observations through guest observer (GO) programs, will assume more importance. Peer review will determine the balance between these elements and the continuation of the cosmology and planet-finding programs. By combining this double-core focus with a broad vision that addresses the diverse and changing research priorities of astrophysics, WFIRST will serve as a dedicated facility and a broader-use observatory.
The ability of a single facility to have such a broad impact—together with its combination of affordability, technical readiness, and low risk—is why the EOS Panel recommends WFIRST as the next large U.S. space mission. The scientific questions WFIRST addresses will certainly evolve during its development. But WFIRST’s versatility—providing wide and deep imaging and wide-field slitless spectroscopy, with diffraction-limited imaging at 2 μm and a high level of system stability—guarantees that this will be a very productive and important facility for astrophysics research.
Probing Cosmic Acceleration
The discovery of cosmic acceleration is among the most exciting science results of our time. After the discovery of the expansion of the universe, astronomers worked for nearly a century to measure its deceleration, to find out how big and how old the universe is and how much matter it contains. When tools and methods finally became good enough to do that job properly, astronomers discovered—to the astonishment of most—that the expansion of the universe is accelerating. Is
this due to the repulsive force of a previously unknown, pervasive mass-energy component—named dark energy but not currently understood—or does it signal a breakdown on the largest scales of Einstein’s description of gravity in his general theory of relativity? Or is this perhaps a clue to something even more exotic and about which even less is understood? The evidence for acceleration is itself compelling, from measurements of galaxy distances used to track the expansion and from studies of ripples in the cosmic microwave background that record the mass-energy density early in the universe’s history. Whether it is dark energy, or a revision in the law of gravity, or something else altogether, these observations are telling us something new about fundamental physics.
The WFIRST program to study cosmic acceleration described here is based on the JDEM (IDECS and Omega) program that was presented to the panel, but it also benefits from the work of a group study within the Astro2010 survey that considered a combined space and ground approach. Specifically, the WFIRST program provides unique contributions to the three methods most strongly endorsed by the Astronomy and Astrophysics Advisory Committee’s (AAAC’s) Dark Energy Task Force: (1) weak lensing; (2) baryon acoustic oscillations; and (3) infrared photometry of 0.2 < z < 0.8 supernovae.
The apparent cosmic acceleration can be determined by measuring either the apparent brightness of “standard candles” or the apparent size of “standard rulers” over the history of the universe. In the general theory of relativity, the detailed expansion history is governed by the equation of state, which describes the relation between pressure and energy density in the universe. The best measurements to date indicate the presence of something like a vacuum “dark energy” that is supposedly driving the acceleration of the expansion of the universe. The most accurate standard candles are supernova explosions (SN Ia), which can readily be measured back to a time when the universe was half its present size. The most promising standard ruler is a “bump” in the galaxy power spectrum—a preferred scale. This is thought to arise from so-called baryon acoustic oscillations (BAOs)—peaks and valleys that were imprinted on the dark-matter distribution by sound waves carried in the primordial plasma of the early universe. A third way of measuring the equation of state hinges on measuring the growth of structures much smaller than the universe itself, for example, clusters of galaxies: in the presence of dark energy their growth is terminated sooner than it would be otherwise. By virtue of their self-gravity, the masses of these structures produce small distortions—weak gravitational lensing—of the shapes of galaxies that lie behind them.
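The way the equation of state governs the expansion history can be made explicit. As a sketch using standard cosmology (not spelled out in the report itself), a component with equation-of-state parameter w dilutes with the scale factor a as follows:

```latex
% The equation-of-state parameter relates pressure and energy density:
w \;=\; \frac{p}{\rho c^{2}}

% A component with constant w dilutes with scale factor a as
\rho(a) \;\propto\; a^{-3(1+w)} ,

% so in a flat universe of matter plus dark energy the expansion history is
H^{2}(a) \;=\; H_{0}^{2}\left[\,\Omega_{m}\,a^{-3} \;+\; \Omega_{\mathrm{DE}}\,a^{-3(1+w)}\right].

% Possible time evolution of w is commonly parameterized as
w(a) \;=\; w_{0} + w_{a}\,(1-a).
```

A cosmological constant (w = −1) gives a dark-energy density that is constant in time; measuring distances (standard candles and rulers) and structure growth as functions of a is precisely what constrains w and w_a.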
While some of these observations are being attempted with ground-based facilities, observing in the near-infrared from space offers powerful advantages, especially in the 1 < z < 2 redshift range where these cosmological measurements are most effective. These advantages include better angular resolution for defining galaxy shapes (weak lensing) and the accessibility of the Hα emission line of hydrogen gas for redshift measurements (BAO) over the maximum volume that can be targeted.
Why should WFIRST employ all three methods? Supernovae (in particular, type SNe Ia) give the best measurements of cosmic acceleration parameters at low redshift due to their greater precision per sample or per object. BAO excels over large volumes at higher redshift. Together SNe Ia and BAO provide the most precise measurements of the expansion history for 0 < z < 2 and place significant constraints on the equation of state. Weak lensing provides a complementary measurement through the growth of structure. Comparing weak-lensing results with those from supernovae and BAO could indicate that “cosmic acceleration” is actually a manifestation of a scale-dependent failure of general relativity. Combining all three tests provides the greatest leverage on cosmic-acceleration questions. WFIRST can do all three. The panel thinks it would be far less sensible to build a smaller, somewhat-less-costly facility to carry out one or at most two of these tests.
Can WFIRST “solve” the problem of cosmic acceleration? Only to the extent that it can rule out alternative explanations. The leading contender among these is Einstein’s cosmological constant, with equation-of-state parameter w = −1 giving a negative pressure equal to the energy density. A measurement of a value of w significantly different from −1 would imply the existence of a “dark energy” that is not “merely” a cosmological constant. At present, w is consistent with −1 to an accuracy of about 10 percent, and its rate of change, w_a, is known to ~100 percent. WFIRST could lower the uncertainty in w to ~1 percent and in w_a to ~10 percent, subjecting the cosmological constant hypothesis to a test 100 times more stringent (using the figure of merit proposed by the Dark Energy Task Force) than current observations.
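The quoted uncertainty reductions can be checked against the factor-of-100 claim with a simple proxy. As a sketch (the helper below is illustrative, not from the report), take the figure of merit to scale as the inverse area of the (w, w_a) error ellipse, i.e., as 1/(σ_w σ_wa) up to a constant:

```python
# Back-of-envelope check that lowering sigma_w from ~10% to ~1% and
# sigma_wa from ~100% to ~10% yields a ~100x gain in a DETF-style
# figure of merit, taken here as FoM ∝ 1 / (sigma_w * sigma_wa).

def figure_of_merit(sigma_w, sigma_wa):
    """Inverse-area proxy for the Dark Energy Task Force figure of merit."""
    return 1.0 / (sigma_w * sigma_wa)

fom_current = figure_of_merit(0.10, 1.00)   # ~10% on w, ~100% on w_a
fom_wfirst = figure_of_merit(0.01, 0.10)    # ~1% on w, ~10% on w_a

print(fom_wfirst / fom_current)  # improvement factor of ~100
```

Each factor-of-10 reduction in one uncertainty multiplies this proxy by 10, so the two reductions together give the hundredfold tightening cited in the text.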
The Architectures of Extrasolar Planetary Systems
Searching for planets orbiting other stars—exoplanets—is a relatively new field, of burgeoning interest to the astrophysics community and the public alike. Most of the 400+ known planets have been found, beginning in the mid-1990s, by observing the “to-and-fro” velocity of neighboring stars as they are tugged by the gravity of orbiting planets. This method is most sensitive to giant planets and especially those that are close to their parent stars. Detection of a solar system like our own remains beyond our reach: the detection of far-out giant planets requires measurements over decades, and detection of a true Earth-twin requires technology beyond the current state of the art. Over this decade, progress in a variety of approaches will begin to change this situation, but for now very little is known about how common are “solar systems” like the one we live in.
NASA’s recently launched Kepler space telescope is up and running and looking for distant planets crossing in front of their parent stars, minutely diminishing their brightness. Kepler’s survey should answer the crucial question of how common Earth-like worlds are in the “habitable” zones of the stars it is monitoring—mostly galactic inner-disk and bulge stars. By the nature of the experiment, the planets that
will be detected by Kepler are thousands of light-years from Earth—too distant for direct study, although the masses of some will be determined by ground-based radial velocity measurements.
Microlensing is another consequence of Einstein’s general theory of relativity: a mass, for example a star, aligning almost exactly along the line of sight to another, more distant star, makes what amounts to a gravitational telescope—the light from the background star is intensified. If there are planets orbiting the star that is “lensing” the background star, they too will amplify the light—briefly—making their presence known. These microlensing events typically last a few days, and the detailed rise and fall of the light yields the planet-to-host-star mass ratio and separation. For most of the observed events, subsequent observations of the host star can determine its mass, which then gives the planet masses directly—accurate to about 20 percent. A spectacular example of the power of the technique is one such event observed with a ground-based telescope, shown in Figure 6.2.
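The rise and fall of a microlensing event has a simple analytic form for a single (planet-free) lens, often called the Paczyński light curve. The sketch below illustrates the brightening described above; the impact parameter u0, peak time t0, and Einstein-radius crossing time tE are illustrative values, not parameters from the report:

```python
import math

def magnification(u):
    """Point-source point-lens magnification for a lens-source separation u,
    measured in units of the Einstein radius."""
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

def light_curve(t, t0=0.0, u0=0.1, tE=20.0):
    """Magnification at time t (days) as the lens star drifts past the
    background star: closest approach u0 at time t0, crossing time tE."""
    u = math.sqrt(u0 * u0 + ((t - t0) / tE) ** 2)
    return magnification(u)

# Far from the event the source is essentially unmagnified; at closest
# approach the magnification peaks (roughly 1/u0 for small u0).
print(light_curve(-100.0))  # ~1 (baseline, well before the event)
print(light_curve(0.0))     # ~10 at peak, for u0 = 0.1
```

A planet orbiting the lens star adds a brief spike or dip on top of this smooth curve; modeling that perturbation is what yields the planet-to-host-star mass ratio and separation quoted in the text.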
Because planets even smaller than Earth can be detected in this way, at a wide range of distances from the parent star, space-based microlensing is an extremely effective way to conduct a survey of what other solar systems are like: How many planets are there? What are their masses? At what distances do they orbit their stars? In Figure 6.3 the panel shows the exoplanet discovery space of a microlensing planet search compared to those of Kepler and several ground-based techniques. The complementarity of WFIRST (illustrated by the MPF curve in Figure 6.3) to Kepler—the greater sensitivity to planets farther from the star, and its deeper reach compared to ground-based microlensing—is readily apparent.
An exoplanet microlensing program requires continuous monitoring of a few fields containing tens of millions of stars in the galactic bulge for long contiguous periods. In the optimistic scenario that every star has an Earth-like planet (fraction of stars hosting an Earth, η⊕ = 1), a 500-day microlensing campaign (spread over the first 5 years of the mission) would find ~200 Earth-mass planets (and many thousands of larger planets). In fact, this is the only technique yet proposed that could find planets with masses smaller than Earth’s and is the only likely way to obtain a good statistical sample of planets with masses less than 5 Earth masses. On the other hand, finding no Earth-like planets at all in such a campaign would indicate a well-measured upper limit to the fraction of stars with an Earth-like world, η⊕, and would be a stunning result. Figure 6.4 shows the expected number of planets of different masses found over 250 days of monitoring for microlensing events with WFIRST.
Because microlensing observations using WFIRST can detect planets covering a wide range in planet masses and orbits, this will be the most complete demographic survey of other planetary systems—an early but giant step in a fascinating program to study and compare other planetary systems to our own. It is the next stage of a decades-long journey to understand whether our inhabited planet is one of many in the cosmos.
The High Value of Near-IR Imaging from Space
The sensitivity of WFIRST for near-IR imaging will be unrivaled. The requirements for a deep, multiband, infrared-imaging survey are essentially the same as those for the weak-lensing program, and so a by-product will be a spectacular, unprecedented, distant-galaxy survey with 0.2-μJy sensitivity and 0.2-arcsecond resolution over a large fraction of the sky—a boon for studies of galaxy evolution, large-scale structure, searches for high-redshift quasars, and galactic white and brown dwarfs, just to name a few examples. WFIRST will sample as deeply as the
deepest IR surveys yet done—for example, the UK Infrared Deep Sky Survey—over an area 10,000 times larger. Compared to other surveys that have sampled wide areas, WFIRST will reach several thousand times deeper than 2MASS at 2.2 μm and a thousand times deeper than the WISE mid-IR surveyor. The community’s need for such a facility can be gauged by the painfully inefficient way existing space telescopes have been used to map areas of less than a square degree to comparable depths—similar studies would require only four exposures with WFIRST.
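Depth comparisons like “a thousand times deeper” are flux ratios, which astronomers usually quote as magnitude differences via Δm = 2.5 log₁₀(flux ratio). A small sketch (the helper name is illustrative, not from the report) converts the figures above:

```python
import math

def mag_gain(flux_ratio):
    """Magnitude difference corresponding to a sensitivity (flux) ratio."""
    return 2.5 * math.log10(flux_ratio)

print(mag_gain(3000.0))  # "several thousand times deeper" -> roughly 8.7 mag
print(mag_gain(1000.0))  # "a thousand times deeper"       -> 7.5 mag
```

In other words, the quoted gains over 2MASS and WISE correspond to roughly 7 to 9 magnitudes of additional depth at those wavelengths.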
By providing Hα detections for most galaxies, and photometric redshifts for all, WFIRST will produce a three-dimensional map of the evolving universe over the redshift interval of 1 < z < 3. Such data will also constrain the history of star formation over this epoch. This will link the evolution of galaxies to the growth of large-scale structure during the critical epoch when galaxies grew from adolescence to adulthood. Its sub-micro-Jansky sensitivity would be a perfect match for optical surveys by the Large Synoptic Survey Telescope (LSST), greatly enhancing their utility, and WFIRST could perform time-resolved surveys over more limited areas of sky, complementing LSST here, too. The ability to resolve stellar populations of nearby galaxies—measuring the detailed shape of the stellar giant-branch—will provide unique information on the histories of star formation for a wide variety of galaxy types. A deep survey of the galactic halo will be the most complete study of the history of satellite-galaxy accretion by the Milky Way, and a similar study
can be done for the neighboring Andromeda galaxy. A survey of the galactic plane, with its treasure trove of sites of vigorous star formation pumping energy and heavy chemical elements into the interstellar gas, will be our most detailed look at galaxy evolution in action.
In addition to synergy with LSST, the surveys done by WFIRST—large and small—will provide high-quality source material for studies of smaller areas of the sky, in concert with ground-based telescopes such as the Thirty Meter Telescope, the Giant Magellan Telescope, and the Atacama Large Millimeter Array, and premier space facilities such as JWST and SPICA. Working with these facilities, WFIRST will play a key role in answering the wide range of questions on galaxy and cosmic evolution identified by the Astro2010 Science Frontiers Panels.
The broad capabilities of WFIRST—carrying out two, very different, dedicated programs and a wide variety of surveys and smaller programs—mean that this facility will be of critical interest to astronomers in a variety of research areas. To contain the cost and risk of this facility, however, the panel recommends that the architecture of JDEM/Omega be adopted and modified only as is necessary to optimize the two core programs: cosmic acceleration and the microlensing search for planets. Mindful of the priority of these two programs, planning for the operation of WFIRST should incorporate broader interests, including those of galactic and extragalactic surveys, stellar populations, and diverse guest observer programs; the panel imagines a newly appointed science working group to address these issues. In summary, the design of the telescope, spacecraft, and focal-plane instrumentation should be left to the project team and focused around the programs of cosmic acceleration and microlensing planet finding, with the science working group helping to construct an operating plan for the facility that can accomplish the combination of the two dedicated programs together with surveys and pointed observations.
The EOS Panel chose to incorporate the study of cosmic acceleration within a broadly capable space mission because it believes that this approach is scientifically compelling and the best use of limited resources by and for our discipline. It hopes that those who have developed concepts for “dark energy” missions with NASA and DOE, including those within these agencies, will find the far greater science productivity of WFIRST a compelling reason to support a broader mission. The panel also recognizes that international collaboration offers the possibility of further benefits. However, it strongly believes that the design, broad purpose, and multiple allocation of resources envisioned in WFIRST should be retained through any process of international partnering.
As a straw-man example for the first 5 years of a 10-year mission, the panel imagines 2+ years dedicated to the cosmic-acceleration program. These observations will provide more than 8,000 deg² for the BAO survey (grism) and 4,000 deg² for the weak lensing (single-band imaging) survey (about half of the JDEM/Omega program) and will produce a large multiband galaxy survey for public archives.5 Dedicated microlensing campaigns of 100 days duration in each of the 5 years could accumulate a significant sample, even within the first few years of the mission. A galactic-plane survey of one-half year, together with about 1 year allocated by open competition, would fill the initial 5-year timeline. Barring any operational problems, WFIRST should continue for another 5 years: peer review would compete augmentations of the cosmic acceleration or planet-survey programs with new or larger surveys and smaller guest observer programs. The initial allocations for cosmic acceleration and microlensing in the panel’s straw-man plan should produce dramatic advances and allow for informed judgment as to the benefit of additional observations. It has not been the panel’s intention here to be overly prescriptive but simply to show how WFIRST might accomplish all that is envisioned.
WFIRST Technical Issues
The concept for WFIRST adopts the hardware configuration of JDEM/Omega, a 1.5-m-aperture three-mirror anastigmat with 2 × 10⁸ pixels from ~50 HgCdTe detectors (JWST and HST-WFC3 heritage) for both imaging (0.33 deg²) and low-resolution slitless grism spectroscopy (0.55 deg²), with high sensitivity from 0.8 to 2 μm. The panel recommends that the focal plane be optimized for the cosmic acceleration and microlensing programs, while avoiding a substantial increase in cost. For example, a change in the number of field segments or the number dedicated to slitless spectroscopy may enhance a broad science program without significantly increasing the cost or complexity of the facility. Mechanisms for exchanging filters and grisms should be the only moving parts in the instrument. To take full advantage of the benefits of observing from space, imaging pixels should be no larger than 0.18 arcsecond, as planned for JDEM/Omega. This will critically sample the diffraction-limited point-spread function at λ = 2.1 μm; next-generation HgCdTe detectors, if available, will have smaller 15-μm pixels that would lower the wavelength of diffraction-limited sampling to λ = 1.7 μm. With WFIRST deployed to L2 and no consumables beyond propulsion, its planned lifetime could be at least 10 years.
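The quoted sampling numbers can be reproduced with a standard rule of thumb; as a sketch, assume the report means that the point-spread-function FWHM ≈ 1.22 λ/D should span at least two pixels (the report itself does not state its exact criterion):

```python
import math

ARCSEC_PER_RAD = 180.0 / math.pi * 3600.0  # ≈ 206265

def critical_wavelength_um(pixel_arcsec, aperture_m):
    """Wavelength (in μm) at which a given pixel scale critically samples
    the diffraction-limited PSF, taking FWHM ≈ 1.22 λ/D = 2 pixels."""
    fwhm_rad = 2.0 * pixel_arcsec / ARCSEC_PER_RAD
    return fwhm_rad * aperture_m / 1.22 * 1e6

# 0.18-arcsec pixels on a 1.5-m aperture: diffraction-limited near 2.1 μm
print(critical_wavelength_um(0.18, 1.5))

# Shrinking the detector pixels from 18 μm to 15 μm scales the pixel size
# to 0.15 arcsec, lowering the critical wavelength to about 1.7-1.8 μm
print(critical_wavelength_um(0.18 * 15.0 / 18.0, 1.5))
```

Under this assumption both figures in the text fall out directly, and the proportionality makes clear why finer detector pixels extend diffraction-limited imaging to shorter wavelengths.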
The JDEM/Omega configuration is somewhat different from that of the Microlensing Planet Finder (MPF; Bennett6) mission whose capabilities are represented in Figures 6.3 and 6.4. Although the wavelength range and detectors are the same, WFIRST’s larger aperture would provide higher resolution with better sampling of the point-spread function by smaller pixels. The factor-of-two smaller field coverage of WFIRST compared to MPF is compensated by the larger aperture, such that equal areas can be sampled in equal time. A significant difference is WFIRST’s location at L2 compared to the proposed geosynchronous orbit for MPF, which requires that the microlensing planet survey be carried out in campaigns of approximately 100 days per year instead of the continuous 9-month monitoring allowed by MPF’s orbit.
The more-general, near-IR imaging surveys and targeted observations envisioned for WFIRST would resemble those described in the Near-Infrared Sky Surveyor (NIRSS) document (Stern7). Like JDEM/Omega, NIRSS is also proposed as a 1.5-m-aperture telescope, with a wide-area focal plane populated with 36 HgCdTe array detectors. NIRSS is configured for a full-sky survey sampling 33 percent more coarsely than WFIRST’s (JDEM/Omega) imaging-camera layout: four filters ranging from 1.2 μm to 3.4 μm are to be exposed simultaneously with nine detectors dedicated to each band. In comparison, WFIRST would sample about twice as much area per exposure in a single color.8 The bottom line is that a three-band-survey (~J, ~H, ~K) would require 3 to 4 years of dedicated NIRSS use to cover the whole sky, while a substantial fraction of the sky is readily incorporated into WFIRST’s diverse program.
It is very clear that WFIRST would be a fully subscribed and extremely productive facility. For this reason, and to avoid cost growth, the panel strongly discourages additional instrumentation, such as optical imaging with charge-coupled devices (CCDs) (for which the advantage of space is far smaller than for the near-IR), coronagraphs, integral field units, and so on. Observing modes other than the “staring” imaging mode described here would increase complexity, cost, and management problems, disproportionate to any benefits. The panel notes that JDEM/IDECS—basically WFIRST (or JDEM/Omega) but including a CCD array camera—was not ranked highly by the panel for such reasons.
D. Bennett, J. Anderson, J.-P. Beaulieu, I. Bond, E. Cheng, K. Cook, S. Friedman, B.S. Gaudi, A. Gould, J. Jenkins, R. Kimble, D. Lin, et al., “Completing the Census of Exoplanets with the Microlensing Planet Finder (MPF),” Astro2010 white paper, available by request from the National Academies Public Access Records Office at http://www8.nationalacademies.org/cp/ManageRequest.aspx?key=48964.
D. Stern et al., “The Near-Infrared Sky Surveyor (NIRSS),” Astro2010 white paper, available by request from the National Academies Public Access Records Office at http://www8.nationalacademies.org/cp/ManageRequest.aspx?key=48964.
There is a possibility of using the WFIRST spectroscopy channels to add a factor-of-two greater area, albeit at a lower sampling scale of 0.35″ per pixel.
WFIRST Cost, Risk, and Trade-offs
The JDEM/Omega spacecraft was analyzed as part of the independent cost assessment carried out by the Astro2010 decadal survey. As a design template for WFIRST, JDEM/Omega rated well in terms of maturity and schedule risk. Technical risk was rated as “medium low” and cost risk as “medium.” The life-cycle cost (including launch, with all costs in FY2009 dollars) according to the project was set at $1.1 billion; the independent assessment came in at $1.6 billion. Based on this input, the panel is convinced that a similar facility with a broad scientific program could be launched in the next decade with a total lifetime cost of ~$1.5 billion, and it urges that this targeted cost be retained.
A concern was raised in the independent assessment about the acquisition of such a large number of HgCdTe detectors. The panel notes, however, that with a Phase B start following the launch of JWST, there should be many years to accumulate these detectors. A more serious concern, also expressed in the assessment, is achieving and maintaining the image quality required for weak lensing over the full focal plane. The JDEM/Omega team recently demonstrated significant progress verifying the performance of the JWST HgCdTe detectors for the specific application of weak lensing, but knowledge of the point-spread function and its time variation remains a mission-driving requirement. Because weak lensing provides the only component of the WFIRST dark-energy program that is sensitive to general relativity and the growth of structure, the panel believes it is vital to commit sufficient resources to address this risk in the WFIRST design as soon as possible. It will then be important to have another independent assessment of the result of this effort and for the WFIRST project, once constituted, to evaluate its merit-to-cost quotient in the context of ground-based measurements of weak lensing (e.g., LSST) and the benefits to the full scientific program. The anticipated schedule of a start in mid-decade should at least prevent this from becoming a risk for significant cost growth, assuming that adequate funds are provided to properly address this risk.
A final concern expressed in the independent cost-risk assessment is the required telemetry rate, which is challenging for a mission at L2 for both of the dedicated programs. Many future missions are likely to demand increased bandwidth, but in the case of WFIRST, this need may become urgent, especially considering the requirements of JWST and other future missions. Onboard data storage and processing, and the development of task-specific data-compression algorithms, may be required. This issue also needs early attention.
IXO AND THE COMPELLING CASE FOR X-RAY ASTRONOMY
Less than 50 years ago the Sun was the only known source of cosmic X-rays. In 1962 a group led by Riccardo Giacconi sent an X-ray detector on a rocket high
above Earth’s atmosphere. The first detected source turned out to be a neutron star—a structure as dense as the nucleus of an atom but as massive as the Sun, the remnant of a supernova explosion.
From its birth, X-ray astronomy has offered a startlingly different view of the universe. X-rays come from extreme objects because extreme conditions are needed to produce an X-ray photon, typically 1,000 times more energetic than a photon of visible light. X-ray photons get their high energies by emerging from gas with temperatures exceeding a million degrees, or from particles moving at nearly the speed of light. An abundant source of X-rays from space, therefore, is always something interesting and unusual. Such sources include supermassive black holes at the centers of distant galaxies, hot bubbles of newly minted elements from supernova explosions, swirling gas around the neutron stars and black holes left by old supernovae, and immense clusters of galaxies. Many such X-ray sources are “invisible”—they cannot even be seen with an ordinary optical telescope—so X-ray astronomy provides unique information that informs a wide range of cosmic questions. However, because of their high energies, X-rays are completely blocked by Earth’s atmosphere—fortunately for life on Earth—and observing them requires a space telescope.
It has been only a few decades since it was learned that the stars, the focus of astronomical research for centuries, account for less than 1 percent of all matter. Ten times as much normal (atomic) matter is in the form of gas, most drifting between the galaxies, never to become stars; X-ray observations are essential for detecting and studying this majority component. Similarly, gas on the brink of tumbling into a black hole emits more X-ray light than visible light, making X-ray observations key for probing the fundamental physics of gravitation.
The Next Step in X-Ray Telescopes
Over the last few decades, NASA has launched many powerful X-ray telescopes, culminating in the Chandra X-ray Observatory, one of the four highly successful Great Observatories. Figure 6.5 shows a Chandra picture of the million-degree gas in the center of a cluster of galaxies and compares it to a Hubble picture of the starlight from the galaxies themselves: X-ray telescopes produce entirely different views of the same part of the universe. But pictures of the sky, whether in X-rays or visible light, record only the most basic information—how the universe “looks.” Astronomers have long recognized that understanding the how and why—the astrophysics—comes mainly from spectroscopy, separating the light into a spectrum of different energies, which our eyes sense as color in visible light. Spectra tell us how the world works.
X-ray cameras have always had a ready ability to do this to some extent: CCD detectors (which produced most of the X-ray pictures on display in this panel report) record not just the location but also the energy of each individual X-ray photon. A typical X-ray “picture” is made up of thousands to millions of photons, sorted by energy, and displayed and analyzed in many different ways. However, this is crude spectroscopy. For example, Chandra is arguably the most powerful X-ray telescope ever built because it focuses X-rays with great precision, revealing details smaller than 1 arcsecond on the sky. But even Chandra has only a modest capability to sort photons by their energies: its CCD cameras can separate photons that differ in energy by 2 percent, a spectral resolution of E/∆E ~ 50. This is generally too low to distinguish important spectral features, such as the emission lines of distinct chemical elements and their ions. In comparison, even a comparatively low-resolution spectrum in visible light has E/∆E ~ 1,000.9 The next big step in X-ray telescopes will come from cameras that produce images with higher energy resolution and from traditional spectroscopic observations (with a diffraction grating) at much greater sensitivity for a vastly larger number of point sources—for example, faint quasars and X-ray binary stars.
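The practical gap between CCD-class and next-generation resolution can be illustrated with the definition R = E/∆E; the line energy below is a round illustrative value near the Fe emission complex, not an instrument specification:

```python
# Hedged illustration: the smallest energy difference a spectrometer with
# resolving power R = E/dE can separate at photon energy E.
def resolvable_de(e_ev: float, resolving_power: float) -> float:
    """Smallest separable energy difference (eV) at energy e_ev (eV)."""
    return e_ev / resolving_power

# CCD-class (R ~ 50) vs. microcalorimeter-class (R ~ 3000) near the
# ~6.7 keV Fe emission complex:
print(resolvable_de(6700, 50))    # 134 eV: nearby lines blend together
print(resolvable_de(6700, 3000))  # ~2.2 eV: individual lines are separated
```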
Such enhanced capability is essential for addressing many key questions in astrophysics. A more sensitive X-ray spectrometer will probe the early universe: the first clusters of galaxies and the first massive black holes. X-ray maps—with better spectral resolution of extended sources like supernova remnants, galaxies, and galaxy clusters—will show how the abundances and ionization stages of the elements change and provide precise information on temperatures, densities, and motions of the gas. For example, the Doppler shift of an emission line and its shape
reveal how fast the emitting gas is moving and its level of turbulence. Such diagnostics are essential for decoding the important physical processes occurring in any astronomical object. Visible-light telescopes have done this for decades, but they cannot study the many interesting objects observable only with X-ray telescopes.
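The velocity diagnostic described above follows from the non-relativistic Doppler relation v ≈ c·∆E/E. A minimal sketch, using an illustrative Fe-line energy (the numbers are examples, not mission requirements):

```python
C_KM_S = 2.998e5  # speed of light, km/s

def doppler_velocity(delta_e: float, e: float) -> float:
    """Line-of-sight speed (km/s) implied by an energy shift delta_e at line energy e."""
    return C_KM_S * delta_e / e

# A ~2.2 eV shift of a 6700 eV Fe line corresponds to roughly 100 km/s,
# the scale of bulk motion and turbulence in hot astrophysical gas.
print(doppler_velocity(2.2, 6700.0))  # ~98 km/s
```

This is why resolving powers in the thousands are needed before X-ray spectra can measure gas motions directly.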
Advancing Our Understanding of the Universe with the International X-Ray Observatory—IXO
The panel recommends IXO, the International X-ray Observatory, as a maturing project that will effectively address key science questions, open a vast discovery space, and provide for the community a versatile tool that will support all of astrophysics.
The heart of IXO is a large-aperture (3 m²), lightweight, focusing X-ray mirror—more than 10 times the aperture of any previous mission—that should achieve 5-arcsecond angular resolution. The key component of the IXO focal plane is the X-ray Microcalorimeter Spectrometer—a 40 × 40 array of transition-edge sensors (TESs) covering several arcminutes of sky with energy-dependent spectral resolution of E/∆E = 250-3,000. TESs are state-of-the-art superconducting devices that accurately measure the energy deposited by, in this case, X-ray photons. IXO also carries traditional X-ray gratings, optimized for point sources, that will provide three times the spectral resolution of current X-ray grating spectrometers, at 10 times the effective aperture (between 0.5 and 2.0 keV). Figure 6.6 compares IXO to contemporary X-ray facilities in terms of effective aperture (light-gathering power) and energy resolution. A schematic view and layout of basic components is shown in Figure 6.7.
The microcalorimeter instrument is, in effect, a powerful X-ray integral-field spectrometer—a revolutionary new capability: moderate spatial and spectral resolution over substantial areas of sky. IXO’s large collecting area also delivers sufficient photons to resolve the time-variability of spectra from black hole accretion disks, neutron stars, and AGN over intervals of tens of seconds. IXO’s combination of spatial and spectral resolution and large throughput will enable scientific breakthroughs well beyond the capabilities of current X-ray telescopes.
The science questions IXO will answer are broad and penetrating and, for the most part, impossible to answer without X-ray spectroscopy. For example:
Does energy feedback from a supermassive black hole—an AGN or a quasar—suppress star formation in its host galaxy? How does the energy pouring from supermassive black holes affect intergalactic and intracluster gas?
What are the energetics and heavy-chemical-element content of galactic winds, and how do they affect a galaxy’s surroundings? How much energy and heavy-element material escape from galaxies into the intergalactic medium?
How much intergalactic gas is there, and how hot is it?
What are the masses and spins of black holes? What spins up black holes, and what determines their masses?
Are the progenitors of Type Ia supernovae always the same kinds of objects?
At what mass threshold does an exploding star leave a remnant black hole instead of a neutron star?
What does the neutron star equation of state tell us about the theory of quantum chromodynamics?
How does structure formation proceed in the densest regions of the universe—clusters of galaxies—gauged by the evolution of the hot gas component that dominates the baryonic mass?
In the following, the panel expands on some of these science questions and the role IXO will play in answering them. The discussion is organized by the themes represented by four of the five Science Frontiers Panels of Astro2010.
Galaxies Across Cosmic Time (GCT)
Computer simulations show that the paradigm for structure growth, the ΛCDM model, successfully predicts the mass and distribution of dark matter on large scales. However, the simple assumption that the normal matter—baryons—starts out cold and falls into the dark-matter halos fails to explain key observations—for example, the ratio of numbers of bright to faint galaxies, the colors of the most massive galaxies, and the correlation between masses and X-ray luminosities of clusters of galaxies.
At least one additional ingredient must be added to our models, and this ingredient is generically called “feedback.” Baryons falling into gravitational potential wells of galaxies shock, compress, condense into clumps, cool, and form stars. Star formation itself creates winds of hot gas and leads to supernova explosions that dramatically rearrange the gas, enrich it, and suppress and/or stimulate further star formation. This is a key ingredient of the feedback process.
Another effect results from the relatively small amount of gas that reaches the galaxy’s center, where it can join an accretion disk surrounding a supermassive black hole—millions or even billions of solar masses. As this gas feeds into the black hole, some 10 percent of its rest mass is transformed into energy, producing huge outbursts of kinetic energy and radiation—feedback—that can transform the environment of the host galaxy and beyond, to distances of thousands, perhaps millions, of light-years.
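The "some 10 percent of its rest mass" figure implies enormous luminosities. A rough order-of-magnitude sketch (the accretion rate is an illustrative value, not one taken from this report):

```python
# Sketch of black hole accretion energetics: L = efficiency * mdot * c^2.
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
L_SUN = 3.846e26     # solar luminosity, W
YEAR = 3.156e7       # seconds per year

def accretion_luminosity(mdot_msun_per_yr: float, efficiency: float = 0.1) -> float:
    """Luminosity (W) of a black hole accreting mdot solar masses per year."""
    mdot = mdot_msun_per_yr * M_SUN / YEAR  # kg/s
    return efficiency * mdot * C**2

# One solar mass per year at 10 percent efficiency:
lum = accretion_luminosity(1.0)
print(lum)          # ~5.7e38 W
print(lum / L_SUN)  # ~1.5e12, i.e. about a trillion Suns
```

Energy output on this scale is what makes feedback plausible as a transformer of entire galaxies and their surroundings.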
Chandra has revealed powerful shocks arising in the cores of galaxy clusters in the nearby universe—a central black hole is the only plausible explanation. Figure 6.8 shows such a galaxy with radio jets and a hot X-ray halo. But how can a radio galaxy, ejecting energy in two directions, heat the intracluster gas in all directions? Theorists have proposed various mechanisms to transport the energy through the cluster, including magnetic fields, conduction, and cosmic-ray heating, but there are few data to test even the most rudimentary models.
The IXO high-resolution imager will provide diagnostics, from measurements of the total masses and galaxy contents of clusters to the turbulence and bulk motions of the intracluster gas. The violent deaths of massive stars in galaxies also drive winds, and the kinetic and thermal energies and the distribution of elements in the winds can point back to different types of supernovae. Such previously unobtainable data are crucial for progress in the study of feedback.
Also key to the story of galaxy evolution across cosmic time is the answer to the fundamental question, Where are the baryons? Although undetected, most of
the baryons in the universe must nevertheless lie between the galaxies, in a hot, tenuous gas only slightly polluted by heavy elements. The detection of this hot, diffuse, intergalactic gas has been a prize pursued with XMM and Chandra spectroscopy, but the goal has remained elusive. IXO, with its high-resolution grating, will be able to provide definitive answers as to whether this hot gas is common or rare, what its physical state is, and how it relates to the ubiquitous cooler gas seen via O VI absorption in UV light.
Galactic Neighborhood (GAN)
Circumgalactic matter is heated, shocked, and enriched with heavy chemical elements by powerful galactic winds driven by the explosions of stars. IXO will detect and study these winds by using quasars and X-ray binary systems to “backlight” the gas and measure its absorption. H- and He-like lines of O and Ne and L-shell Fe lines (0.5-2.0 keV) broader than 100 km/s will be resolved, and crucial absorption-line velocities will be measured with a precision of 10 to 20 km/s.
Galactic winds are also “seen” in X-ray emission lines between 0.5 and 2.0 keV. Imaging with the microcalorimeter, sensitive to a few × 10−15 erg s−1 cm−2 arcmin−2, will allow studies of the energy content and chemical compositions of the winds emanating from ~40 starburst galaxies and, for the first time, the kinematics of the X-ray gas—bulk flows and turbulence. Figure 6.9 shows a composite X-ray/optical image of a local starburst galaxy, M82, and illustrates the capabilities of Chandra and IXO to study the hot gas erupting from such a galaxy.
Stars and Stellar Evolution (SSE)
X-ray observations provide possibly the best, if not the only, opportunity to determine the masses and radii of neutron stars. The mass distribution can tell us whether there are different paths of stellar evolution that make neutron stars. The mass-radius relation for neutron stars constrains the equation of state of matter at supra-nuclear density. This is a vitally important test of quantum chromodynamics, QCD, which predicts the state of matter in the early universe, quark confinement, the role of gluons, and the structure of the proton and neutron. Figure 6.10 shows the kind of data that IXO will obtain and its application to the problem.
Massive stars end their lives as supernovae—stupendous explosions that are the most violent events in the universe. Figure 6.11 shows the emission from the Tycho supernova remnant, comparing a composite Spitzer/Hubble/Chandra image to a simulated IXO calorimeter image. IXO will provide critical information, such as the Mn/Cr ratio, which is sensitive to the electron-to-nucleon ratio in the progenitor white dwarf, and abundances of trace elements beyond O, Ne, C, Si, S, and Fe in about a half-dozen Milky Way supernova remnants, and it will be able
to acquire spectra of remnants in other Local Group galaxies. Such data can be used to make a definitive determination of the pre-explosion mass that separates a core-collapse supernova forming a black hole from one that forms a neutron star.
Cosmology and Fundamental Physics (CFP)
As mentioned above in the WFIRST section, cosmic acceleration as a manifestation of “dark energy” could in fact be an incorrect diagnosis: a departure from Einstein’s general theory of relativity on cosmic scales is an alternate explanation. WFIRST will test this using weak lensing. IXO can provide an independent assessment by counting the number/mass distribution of clusters of galaxies as a function of lookback time. Essential for this test are better mass determinations than have been previously possible, attainable through the greater sensitivity and spatially extended temperature measurements (Figure 6.12) provided by IXO. Furthermore, comparing the number of galaxy-cluster-size dark-matter halos to the number of lower-mass dark-matter halos (providing a better measurement of the shape and normalization of the halo mass power spectrum) would be a sensitive test of the neutrino mass, an important goal in fundamental physics.
As discussed above, IXO will measure the mass, radius, and spin of neutron stars, data that have great relevance to fundamental physics. Similarly, IXO’s ability to measure the profile of the Fe K line will give the spin of a black hole of any mass, from stellar-size objects to supermassive black holes, up to billions of times the mass of the Sun.
IXO Programmatic Issues
IXO enjoys the enthusiastic support of the X-ray community and the keen interest of a broad group of multiwavelength astronomers. U.S. astronomers have, over decades, developed close collaborations with ESA and JAXA astronomers that have produced stunning successes. The panel recognizes, however, that a large, international collaboration also carries with it potential inefficiencies and challenges. It is essential that the U.S. IXO team lead a concerted effort to streamline the organization of the IXO project, to clarify the relationships among the partners, and to simplify interfaces, both personal and technical. The panel also thinks it crucial that the international collaboration for IXO share risks—for example, cost growth—as well as opportunities.
IXO Cost, Risk, and Trade-offs
The independent cost appraisal process of Astro2010 described IXO as a project with two medium-high risks: (1) an insufficient mass margin for a project at this stage, which could lead to a larger launch vehicle, and (2) the challenge of successfully manufacturing the large-aperture mirror and achieving a spatial resolution of 5 arcseconds. The panel believes that, were these risks to lead to significant cost growth, they would endanger the entire space astrophysics program. The panel therefore evaluated the impact on some key science programs of a 30 percent reduction in mirror area—a substantial mass reduction—and a spatial resolution of only 10 arcseconds, the current state of the art. It concluded that the ability of the mission to meet its primary science goals would not be heavily compromised; the degradation would be mostly quantitative rather than qualitative. Generally, this is because the principal science goals are photon limited rather than background limited, so a reduced aperture can usually be compensated for simply by longer exposures. In particular, the premier instrument—the calorimeter—would still provide breakthrough science even with a factor-of-two lower spatial resolution.
The panel’s interpretation of the independent cost appraisal and technical evaluation was that by rescoping IXO in this or similar ways the technical risk can be reduced to “medium.” Further possibilities for reducing cost and schedule risk include the deletion of one or more instruments and/or elimination of the instrument turntable in favor of fixed stations. The panel does not recommend a specific set of rescoping options—its purpose here is to express its conclusion that options are readily available and that they would not substantially degrade the value of the mission. The key to keeping IXO’s scientific priority is to feed a calorimeter with data from a much larger collecting area than has been done before.
The XGS (the grating spectrometer) was also identified in the independent cost appraisal and technical evaluation as a significant advance beyond current capabilities. Losing this capability would significantly degrade IXO’s study of the intergalactic and interstellar media through absorption of the light of background AGN—the calorimeter cannot reach the required energy resolution of several thousand at the energy of interest. Although the loss of this one science program would be significant, the mission would remain a giant step in the study of the X-ray universe. The panel recommends that the XGS be retained in the context of a “best effort” but that it not be allowed to drive the cost of the mission in a significant way. Again, the panel’s point is that the mission remains a very desirable one even if the proposed performance of the XGS cannot be met.
The IXO project presented a life-cycle cost (including launch; all costs in FY2009 dollars) of $3.2 billion, with a 50 percent U.S. share being $1.6 billion. The independent cost appraisal process reported a life-cycle cost for the full IXO as $5 billion, which at 50 percent would imply a $2.5 billion U.S. share. The panel believes that this level of investment is too high, given projected resources and the necessity of a balanced program. The panel therefore concluded that any needed rescoping must be done as soon as possible, with a target U.S. share not larger than $2.0 billion to be validated by a subsequent independent cost and risk assessment. The rescoped IXO, the one that meets this target, is the one that the panel recommends.
Previous experience has also revealed significant hidden costs in major international collaborations, often at the level of tens of percent. Thus, not only the design issues, but also the definitions of interfaces and responsibilities, both technical and managerial, challenge the ability of the project to keep the U.S. contribution at the
$2.0 billion level. Given the delayed availability of funding, the panel believes that there is sufficient time for the project to demonstrate IXO’s essential technologies, rescope as necessary, reduce major risks, and develop a detailed, convincing plan for the international partnership, all before committing to a new start. Adequate funds are essential to carry out these activities successfully.
MISSIONS TO SEARCH FOR AND STUDY EXOPLANETS
State of the Art
There has been remarkable progress in the discovery and understanding of exoplanets in the current decade. At the time of this writing, more than 400 have been discovered. Nearly all have masses (modulo system inclination) determined by ground-based radial velocity measurements; many are in multiplanet systems. Radial velocity measurements are now made to an accuracy of better than 1 m s−1, and many systems have been monitored for a decade or more. Steady improvements in precision, which may eventually reach 20 cm s−1 or better, have already led to the discovery of planets with orbital periods from less than 1 day to more than 10 years and with masses as small as . However, most known planets have large masses, about that of Jupiter or larger, and many orbit their stars in only days or weeks.
This impressive progress is eclipsed, so to speak, only by what has been learned from the 70 or more planets discovered to transit their host stars. Transit measurements yield a planet’s radius and orbital inclination; adding a mass from radial velocity measurements allows the planet’s density to be determined, which is key to modeling its nature. Visible and near-IR spectroscopic measurements during transit using the Hubble and Spitzer telescopes have detected molecules in the planetary atmospheres, and planetary thermal emission has been measured using Spitzer observations when planets pass behind their stars (secondary eclipse). The derived measurements of temperature profiles, compositions, and winds are far advanced from the situation at the start of the decade, when there was only a single known transiting exoplanet, and there were no spectroscopic or thermal emission measurements.
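The density determination described above is straightforward once both the radius (from the transit) and the mass (from radial velocities) are in hand. A minimal sketch with illustrative Jupiter-like values, not a catalog entry:

```python
import math

def bulk_density(mass_kg: float, radius_m: float) -> float:
    """Mean density in g/cm^3 of a sphere of the given mass and radius."""
    rho_si = mass_kg / ((4.0 / 3.0) * math.pi * radius_m**3)  # kg/m^3
    return rho_si / 1000.0

M_JUP, R_JUP = 1.898e27, 6.99e7  # nominal Jupiter mass (kg) and radius (m)
print(bulk_density(M_JUP, R_JUP))  # ~1.3 g/cm^3: a gas giant, not a rocky world
```

A density near 1 g/cm³ signals a gas-dominated planet, while values above ~5 g/cm³ point to rock and iron, which is why mass plus radius is key to modeling a planet's nature.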
The recently begun CNES COROT (Convection, Rotation, and Planetary Transits) mission has already discovered the first transiting super-Earth, a planet with less than twice Earth’s diameter, with more likely to follow. NASA’s Kepler Discovery mission will continue to monitor a field of more than 10^5 stars for at least the 3.5-year duration of its prime mission. Kepler should detect more than 100 Earth-size, habitable-zone (HZ) planets transiting their host stars if η⊕ (the fraction of stars with an Earth-like planet) is sufficiently high. A null result—no Earth-size planet detections—would be very significant, constraining η⊕ to less than 0.1 with high confidence. The actual number of Earth-size planets in habitable zones will not be known until 2013 or later, because all candidate detections by Kepler must be followed up with high-spatial-resolution images and radial velocity measurements to rule out blended stars and spectroscopic binaries. The stars monitored by Kepler are relatively faint, and so radial velocity measurements will not be easy, and further study of the properties of these planets will be very challenging.
JWST’s large aperture, stable environment, and powerful instruments will provide significant capabilities for exoplanet imaging, allowing JWST to detect directly even gas giants that are old and cool, at separations of 1 to 200 AU from their host stars. Transit spectroscopy with JWST will reveal molecular constituents of gas-giant and ice-giant atmospheres and will measure temperatures and abundances, providing insight into planet formation and accretion processes. JWST may even be able to detect molecular constituents in the atmospheres of super-Earths transiting nearby M stars, as well as thermal emissions from their surfaces. This would constitute significant progress toward the PSF Panel’s “discovery area” of characterizing a nearby, habitable exoplanet.
Exoplanet Missions for the Next Decade
Together with ground-based observations, these missions will provide significant data on a wide variety of exoplanet properties, but they are just first steps toward answering important scientific questions about exoplanets. For example, the mass distribution of terrestrial planets will still not be known, because Kepler will measure only sizes. The sample of masses for small planets from radial velocity observations will probably remain small (and possibly biased) because of the difficulty of measuring such a small velocity shift (~10 cm/s for Earth), with the accuracy ultimately limited by stellar variability. Spectra of a true Earth-twin or Jupiter-twin (similar masses at similar distances from a Sun-like G star) are well beyond the capabilities just described.
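The ~10 cm/s figure for an Earth twin follows from the standard stellar reflex-velocity formula, sketched here for a circular, edge-on orbit (an idealization for illustration, not a fit to any real system):

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # kg
M_EARTH = 5.972e24  # kg
YEAR = 3.156e7      # s

def rv_semi_amplitude(m_planet: float, m_star: float, period_s: float) -> float:
    """Stellar reflex velocity (m/s) for a circular, edge-on (sin i = 1) orbit."""
    return ((2.0 * math.pi * G / period_s) ** (1.0 / 3.0)
            * m_planet / (m_star + m_planet) ** (2.0 / 3.0))

# An Earth twin orbiting a Sun-like star at 1 AU (1-year period):
print(rv_semi_amplitude(M_EARTH, M_SUN, YEAR))  # ~0.09 m/s, i.e. ~9 cm/s
```

A signal this small sits at or below the intrinsic velocity jitter of typical stars, which is why stellar variability is the ultimate accuracy limit noted above.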
Without new facilities, little will be learned about the statistical distribution of planetary masses, and virtually nothing will be learned about the properties of individual planets—numbers, masses, densities, atmospheres, and so on—around the closest stars.
WFIRST Microlensing Survey: A Census of Planet Orbits and Masses
The microlensing exoplanet survey proposed for the WFIRST mission will measure the distribution of planetary masses down to below Earth’s mass, from semimajor axes less than 0.5 AU to beyond 10 AU. This approach will provide high sensitivity to planets on both sides of the “snow line” at ~3 AU, the orbital distance beyond which gas giants like Jupiter are believed to form. Figure 6.13 compares the sensitivities of WFIRST and Kepler: when combined they will be sensitive to planets with sub-Earth size and sub-Earth mass, from ~0.1 AU to beyond 10 AU, effectively completing the statistical survey strongly urged in the PSF Panel’s report (Chapter 4 of this volume). The prospective WFIRST upper limit on η⊕ discussed above is somewhat tighter than the constraint from Kepler. Furthermore, while Kepler will be sensitive to Earth-size planets in the habitable zones (~1 AU) of F/G/K stars, WFIRST will be more sensitive to colder Earth-mass planets 0.5 to 5 AU from K and M stars. Since WFIRST results will improve with each succeeding campaign, there is a good chance that WFIRST will eventually succeed in measuring η⊕ over the entire range of orbital separations.
Surveying and Characterizing the Closest Planetary Systems
The Kepler and WFIRST surveys will determine the statistical distributions of exoplanets, while future transit and secondary-eclipse observations will reveal much about the atmospheres of giant planets and perhaps something about terrestrial planets as well. However, new missions are needed to take the next crucial steps: to search for planets around the closest stars, to measure their masses, and to characterize their atmospheres.
The panel is enthusiastic about one such program, SIM Lite—a 5-year mission to measure planet masses for the nearest ~60 stars. Because these stars are mostly within 10 pc, planets orbiting them would be good candidates for future direct-detection (imaging) and characterization missions. The SIM Lite sensitivity to nearby planets is shown in Figure 6.14. Its single-pointing precision of ~10 microarcseconds or better would also allow astrometric measurements of other targets, such as massive stars, neutron stars, AGN, and other targets too faint for the all-sky ESA Gaia mission. SIM Lite could detect stellar streams in the Milky Way’s halo and map the shape of the galaxy’s dark-matter halo through precise measurements of space motions of stars in the halo.
In the independent cost-appraisal process of Astro2010, SIM Lite was judged to be low-risk and technically ready for Phase C, with a lifetime cost (including launch; all costs in FY2009 dollars) of $1.9 billion, to be compared with the project’s estimate of $1.4 billion. The panel agrees with the Astronomy and Astrophysics Advisory Committee’s Exoplanet Task Force (ExoPTF) and the Astro2010 PSF Panel that a sub-microarcsecond astrometry mission like SIM Lite would be the current best choice for a new exoplanet mission. However, as the third-ranking priority in the panel’s program, any exoplanet mission could start only late in the decade, at the earliest. Because exoplanet science, observation techniques, and technologies are advancing rapidly, the panel recommends that the present preference for an astrometry mission be re-evaluated by an expert panel if and when a new start becomes possible. A mission that is capable of precision astrometry and high-contrast imaging and spectroscopy—a concept emerging recently and still very immature—could be especially powerful. Near-term technology development will be crucial to assess its feasibility.
Alternative missions to study the planetary systems of nearby stars are developing rapidly. Direct detection and spectroscopy are compelling for both their scientific and their “exploration” value. Even low-resolution spectra can distinguish between planetary atmospheres dominated by water or by methane, and can identify non-equilibrium abundances that could indicate the presence of life on other worlds, as highlighted in the PSF Panel report. However, the efficient starlight suppression systems that are needed are not yet mature. It is the judgment of the panel that missions capable of characterizing a significant number of Earth-like planets could not be started before 2020. However, the panel did evaluate, and found appealing, several “probe-class” concepts employing ~1.5-m primary mirrors and internal starlight suppression systems, often coronagraphs with advanced wavefront control. Each was judged to be technically feasible after completion of a several-year technology development program, and could cost significantly less than a precision-astrometry mission like SIM Lite.10 Such a mission could image about a dozen known (radial velocity) giant planets and search hundreds of other nearby stars for giant planets. Importantly, it could also measure the distribution and amount of exozodiacal disk emission to levels below that in our own solar system (1 zodi) and detect super-Earth planets in the habitable zones of up to two
dozen nearby stars. These would be extremely important steps, both technically and scientifically, toward a mission that could find and characterize an Earth-twin.
The level of zodiacal-dust emission from disks around nearby stars is an unknown but crucial factor for designing planet-finding missions. Figure 6.15 is a simulated image from a 1.5-m-aperture coronagraphic space telescope of the radial velocity giant planet 47 UMa b; an exozodiacal dust disk with 3 zodis of material has been added. A terrestrial planet would be much more difficult to detect with this level of exozodiacal dust. Measuring exozodiacal dust to levels of 1 to 10 zodi in the habitable zones of nearby stars is critical for understanding the content and
dynamics of small rocky bodies in planetary systems and for planning ambitious future imaging missions. It is highly unlikely that such measurements will be done with extant missions or from the ground, but the probe-class mission described above—and perhaps even an Explorer-class mission—will do the job. The panel believes that making this important measurement within the decade would be an important step on the path to a future mission to find and characterize planets around nearby stars.

Another rapidly developing technique is to discover transiting planets and make sensitive spectroscopic observations of the host star’s light during transit to probe the planet’s atmosphere. The PSF Panel rated this among its highest priorities, and the EOS Panel endorses that position. JWST will be highly capable of studying transiting giant and super-Earth planets around very nearby stars, but it would be inefficient to use JWST to find them. Ground-based surveys are fully capable of discovering transits of giant planets like Jupiter, which would produce a ~1 percent dimming of the star’s light. However, an Earth-size planet would dim the light by only 0.1 percent to 0.01 percent, depending on the size of the star, too difficult a detection for ground-based observations. For small planets, a survey with a high-precision space telescope covering a substantial area of sky will thus be needed to assemble a suitable target list for JWST. A space survey is also needed to find planets sufficiently far from their stars to be cool enough to support life—these would orbit once every few weeks or months. Such a list of nearby stars with small transiting planets will enable JWST to address the PSF discovery area of identifying and characterizing nearby habitable exoplanets.
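The quoted dimming percentages follow from the geometric transit depth, (R_planet/R_star)². A quick check with nominal Sun, Jupiter, and Earth radii (illustrative round values):

```python
# Fractional dimming during a transit is the ratio of the planet's disk
# area to the star's disk area: depth = (R_planet / R_star)^2.
R_SUN, R_JUP, R_EARTH = 6.957e8, 6.99e7, 6.371e6  # nominal radii, m

def transit_depth(r_planet: float, r_star: float) -> float:
    """Fractional drop in stellar flux during transit."""
    return (r_planet / r_star) ** 2

print(transit_depth(R_JUP, R_SUN))    # ~0.010  -> the ~1% giant-planet dip
print(transit_depth(R_EARTH, R_SUN))  # ~8.4e-5 -> ~0.01% for Earth-size
```

The hundredfold gap between these two depths is the reason giant-planet transits are routine from the ground while Earth-size transits require a high-precision space telescope.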
A recent Small Explorer (SMEX) mission-concept study described a space-based, all-sky transit survey in visible light that could find super-Earth planets for JWST follow-up. The panel also reviewed a concept for a similar survey at infrared wavelengths, which would be more sensitive to planets around the small and numerous M stars. Such a facility might be implemented within the Explorer program, with a development time short enough to yield targets sufficiently early in the lifetime of JWST. The panel believes that the Explorer-class scale is well suited for this activity and looks forward to strong proposals for this approach.
In summary, the panel believes it is too early to choose an exoplanet mission that might be started late in the decade. An astrometric mission, a direct-detection mission, a transit mission, or a concept not yet developed are all prime candidates. Therefore, the panel recommends a generic exoplanet mission as its third priority, after WFIRST and IXO, but leaves this choice to later judgment. Any of the missions briefly described here will make good progress toward addressing the Science Frontiers Panels’ question PSF 3, How diverse are planetary systems? The more intriguing question, PSF 4, Do habitable worlds exist around other stars, and can we identify the telltale signs of life on an exoplanet? is considerably more challenging. The panel discusses this challenge further in the following section.
AN ADVANCED ULTRAVIOLET-OPTICAL TELESCOPE
The science and mission discussions above frequently resonate with the idea of a more capable UV-optical telescope to follow the Hubble. As discussed above, scientists now know that most baryonic matter is in the form of low-density gases and plasmas in galaxy halos and the intergalactic medium. High-resolution UV spectroscopy is essential for detecting gas in these plasmas and probing low-density inflows and outflows of matter and energy that profoundly affect galaxy evolution. Figure 6.16 shows real and schematic examples of O VI absorption from such gas in the light of a background quasar (QSO).
Of course, as the Hubble has shown, deep, high-resolution imaging in the near-UV and visible is a touchstone for a good deal of astronomical research, including those programs that are built mainly around other wavelengths, from radio to gamma rays. It is difficult to imagine an advanced UV-optical telescope that would not, in addition to spectroscopic capability, include high-resolution cameras, probably with considerably greater areal coverage than has been possible with the Hubble.
Furthermore, as discussed above in the section “Missions to Search for and Study Exoplanets,” answering what is arguably the ultimate question in this field, Are there other worlds like Earth, and do they harbor life?, is a formidable challenge, and so it is important to acknowledge that a promising way to search for habitable planets and life employs the platform of a large-aperture, UV-optical telescope.
What these three elements share is great promise for a facility that cannot be realized without significant technology advances to make these science goals attainable and—especially if a large aperture is required—affordable. The panel views this as a unique opportunity that requires a dedicated technology development program. Because it believes that a UV-optical telescope is a particularly strong candidate for a new start in the 2021-2030 decade, the panel recommends pursuit of several different technologies over the coming decade that will, together, help define what form this keystone facility will take:
Previous UV spectrographs on HST and the decommissioned Far Ultraviolet Spectroscopic Explorer (FUSE) have provided only glimpses of the missing baryons. Significant progress will be enabled by the COS on the Hubble, but such recent UV instruments are still hampered by low-quantum-efficiency detectors and significant light loss from non-optimal optical coatings. It is estimated that an order-of-magnitude improvement in efficiency is possible. With significant improvements in detectors and coatings, substantial progress in “baryon science” could be achieved with an aperture no bigger than that of the Hubble.
Measuring the masses and spectra of Earth-like planets around a significant number of nearby stars is a challenging technological problem because the contrast
between the star’s light and that of an Earth-twin would be enormous—a factor of ~10 billion. Although significant advances in coronagraphic and related techniques have been made in the laboratory toward this level of “starlight suppression,” substantial work remains to demonstrate that this approach is practical and that it can be made sufficiently robust for a space mission. Such concerns have led other groups to promote an alternative: external star shades that serve as artificial moons to eclipse the light of distant stars so that their much fainter planets can be studied. This is an approach with a technical challenge as large as its promise. Concerted development of all types of starlight suppression techniques is essential if the large-optical-telescope approach to exoplanet research is to prove feasible, buildable, and successful.
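The ~10 billion contrast figure can be reproduced with a back-of-the-envelope reflected-light estimate (an illustrative sketch, not from this report; the albedo and phase factors are assumed values):

```python
# Reflected-light contrast of an Earth twin at 1 AU from a Sun-like star.
# At full phase the contrast is roughly  A_g * (R_planet / a)^2,  where
# A_g is the geometric albedo and a the orbital distance.

R_EARTH_KM = 6_371.0
AU_KM = 1.496e8
GEOMETRIC_ALBEDO = 0.3  # assumed, roughly Earth-like

full_phase_contrast = GEOMETRIC_ALBEDO * (R_EARTH_KM / AU_KM) ** 2  # ~5e-10

# Near quadrature only part of the illuminated hemisphere is visible
# (an assumed phase factor of ~0.3), pushing the contrast toward ~1e-10,
# i.e., the star outshines the planet by a factor of order 10 billion.
quadrature_contrast = full_phase_contrast * 0.3  # ~1.6e-10
```

Suppressing starlight by ten orders of magnitude, stably, over the hours of an exposure is what drives the coronagraph and star-shade development described above.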
Could a single telescope do both the baryon and exoplanet science, and include the broad astrophysics program that the Hubble has supported with its superb cameras? It is not obvious. Among the chief concerns, for the case of internal starlight suppression, are the uncertain characteristics of putative high-performance UV coatings: wavefront-cancellation techniques place stringent demands on mirror coatings. External star shades avoid this problem, but the necessity of a second spacecraft that must be launched, precisely deployed, and able to fly in formation at a great distance from the UV-optical telescope could take such a mission beyond the $5 billion expenditure that, at this time, seems to be the upper bound for the cost of an astrophysics space mission.
Since both UV spectroscopy and exoplanet science will be “photon starved,” the quest for a larger-aperture telescope—4 m or greater—will be hard to resist. However, cost is likely to be the limiting factor, and so it was good to learn that several engineering groups are experimenting with radical new ways to achieve large-aperture mirrors for space telescopes at a fraction of the weight and cost of the present generation. Deployable segmented mirrors are but one option.
It is abundantly clear that each of these technology challenges—UV coatings and detectors, starlight-suppression techniques, and bigger, less expensive mirrors—needs to be vigorously supported if a path is to be cleared for building the most capable UV-optical telescope, the Hubble successor. By combining these technology initiatives with the science results that will come in this decade—for example, the “missing baryon” search with the COS, the Kepler survey that will tell us how common Earth-like planets are, and possible measurements of exozodi levels for nearby stars—the astrophysics community would be in a far better position to know in what direction to move and how far to reach.
The EOS Panel believes that, if technology developments of the next decade show that a UV-optical telescope with a wide scope of observational capabilities can also be a mission to find and study Earth-like planets, there will be a powerful reason to build such a facility. If a single facility cannot effectively carry out both
programs, the panel recommends separate paths, including renewed attention to mid-IR interferometric planet finders (e.g., TPF-I), and the prompt development of a plan for a UV-optical telescope for the 2021-2030 decade.
MODERATE AND SMALL PROGRAMS
“BLISS”: U.S. Participation in the JAXA SPICA Mission
The EOS Panel recommends proceeding expeditiously with a high-sensitivity, moderate-spectral-resolution spectrometer for the JAXA-led Space Infrared Telescope for Cosmology and Astrophysics (SPICA). The panel reviewed a description of a specific concept for such an instrument, the Background-Limited Infrared-Submillimeter Spectrograph (BLISS), but the panel’s recommendation applies to any instrument with similar capabilities and cost. For convenience, the panel uses the BLISS acronym in the following discussions, without taking a position on the competitively selected approach.
BLISS represents a consensus approach, as summarized in the U.S. Far-Infrared Astrophysics Community Plan that was submitted to the Astro2010 decadal survey. It provides an exciting opportunity for new science, building on results from Spitzer, Akari, and Herschel. Contributing BLISS will also allow testing of the rapidly advancing far-IR detector and cooling technologies, both in developing them for flight and in gaining experience and heritage with their in-flight performance. The endorsement of BLISS by the infrared community is based on its ability to achieve these goals within a strongly resource-constrained environment. BLISS will help build scientific and technical infrastructure that will enable the more powerful far-IR missions of the future.
SPICA promises breakthroughs in far-IR astronomy. Compared to Spitzer, it will have a much broader instrument suite, with advanced detector arrays having orders-of-magnitude improvements in performance. The 3.5-m aperture of SPICA directly enables both higher sensitivity and higher angular resolution, and it combats the confusion noise that was the ultimate limitation for Spitzer in deep far-IR observations. Compared to Herschel, SPICA will use improved detector arrays (both higher sensitivity and larger formats) and will combine them with a sufficiently cold and low-background telescope to take full advantage of their capabilities. In concert with the other instruments planned for SPICA, BLISS can provide the spectroscopic measurements of faint sources that are essential for many studies. For example, BLISS will measure the strengths of far-IR fine-structure lines in distant galaxies. These lines are important coolants for the interstellar medium, and they characterize both the level of energy input (e.g., from star formation) and the potential chemical evolution of the interstellar gas with redshift. Mid-IR fine-
structure lines that signal AGN will fall within the BLISS operating range even at moderate redshifts (z > 0.5), and so BLISS will have unique power to identify AGN from very early times (z ~ 6) to the epoch when they were most common (z ~ 2), and nearly to the present. The high sensitivity and angular resolution of BLISS/SPICA will let astronomers use far-IR fine-structure lines to probe conditions in star-forming regions within the Milky Way galaxy, for a range of different density and metallicity environments distributed over the galactic disk. It will also be possible to explore protoplanetary disks at the time of formation of gas- and ice-giant planets; giant planets appear to be common, but our current understanding of how they form is very incomplete.
The Necessity of a Healthy Explorer Program
The Explorer program has been a key component of the NASA portfolio since the earliest days of the space program. The relatively low-cost astrophysics Explorers have been highly productive and have provided much of the transformational science of the past 50 years, for example: (1) UHURU, the first X-ray sky survey; (2) IRAS, the first all-sky infrared survey, which discovered protoplanetary dust disks around nearby stars; (3) COBE, finding compelling evidence for the big bang by demonstrating the exact blackbody spectrum of the CMB, detecting the primordial density fluctuations that have led to stars and galaxies, and discovering a cosmic infrared background of starlight absorbed by dust over cosmic time and re-radiated in the IR; and (4) WMAP, mapping the CMB with sufficient precision to measure accurately long-sought cosmological parameters. There are many others.
The Explorer program is also an essential programmatic element of the space-science program: (1) it allows focused investigation of key questions not readily addressable with general-purpose missions; (2) it enables relatively rapid response to changing scientific knowledge; and (3) it permits the use of highly innovative technologies at lower risk than would be the case for large missions.
The Explorer program was very active between 1995 and 2003, with six MIDEX and five SMEX missions selected for flight (although two were subsequently canceled) at an average program cost per year of about $200 million (real-year dollars). However, since 2003 there has been a steep drop in the frequency of Explorers: only one astrophysics Explorer has been selected since then—the recently chosen Gravity and Extreme Magnetism Small Explorer (GEMS), which is currently in the formulation phase. WISE was launched in December 2009 and NuSTAR is in development, with a launch planned for 2011. Budgets for the combined Astrophysics and Heliophysics Explorer programs are projected to increase to $170 million per year (real-year dollars) over the next 5 years, significantly below the 1995-2003 level in real terms. Further compounding the problem is the lack of a reliable, affordable launcher for MIDEX-scale missions since the shutdown of the
Delta II program. The panel strongly recommends that NASA encourage initiatives in private industry to find a robust replacement. Re-establishing a higher launch rate for Explorers would be an important incentive toward that end.
The 2000 decadal survey endorsed “a vigorous Explorer program” (AANM, p. 28). The panel endorses that objective as well, reiterating the goal of the astrophysics community, and NASA, of an annual astrophysics Explorer launch. The 2006 NRC report Principal-Investigator-Led Missions in the Space Sciences (The National Academies Press, Washington, D.C., 2006) offers extensive advice for strengthening the Explorer program, together with insights from the community that should help to improve its effectiveness as it is re-invigorated. The EOS Panel’s endorsement includes a recommended augmentation of the present support level for astrophysics Explorers that would restore a launch rate of one Explorer per year by the end of the decade (see the section “Funding a Balanced Program” below).
The Suborbital Program
The Suborbital program is a small but essential part of NASA’s overall program of science and technology development. Over the past decade it has had multiple successes across broad areas of science, including measurements of the CMB, IR studies of the early universe, and detection of cosmic rays and even neutrinos. Along the way, suborbital experiments have tested technologies and techniques that enable future missions. Because of the relatively low cost, the program also offers an excellent environment in which to train graduate students and young postdoctoral scientists. Many leading astrophysicists, including many leaders within NASA, gained invaluable early experience in the program. The generic utility of the Suborbital program is widely recognized by the community. Of the programs submitted to the EOS Panel by the community, ~25 percent incorporated suborbital work (the bulk of them in the balloon program), ranging from simple checks of technology, to crucial pathfinders, to full suborbital-based science programs. Easy, cost-effective access to near space is essential.
The study of the CMB is a case in point. Our remarkable progress in cosmology is due primarily to innovation in detectors and instruments that continue to push measurements of CMB anisotropies to astonishingly small levels. The CMB community informed the panel that it is working toward a next-generation satellite (a Planck successor named CMBPol) to tackle the exceedingly difficult measurements of CMB B-mode polarization that probe cosmic inflation and gravitational radiation from the big bang. The community stated that informing the design of this next mission requires continuation of a combination of technology development, ground-based, and balloon-borne experiments for the next few years. Support is currently at an adequate level, but it is essential that this support, and flight opportunities, continue. Because the detailed characteristics of CMBPol remain contingent on these efforts, the panel found it difficult to prioritize this mission; nevertheless, the panel recognizes how vitally important this research is to astrophysics and gives its unqualified, vigorous endorsement to this program.
More generally, the Suborbital program has come under increasing stress in recent years by being tapped to subsidize the technology-development program, particularly in the area of new detectors. At the same time, in the case of the balloon program, the payloads have become ever more capable and consequently more costly. The net result has been a drastic reduction in the number of payloads that can be supported, and a corresponding reduction in the number of flights.
In the coming decade, when budgetary constraints will limit the number of much-more-expensive satellite programs, increasing support for the Suborbital program is a priority. Doubling access to near space would keep multiple areas of study vibrant at relatively low cost. This will naturally require a corresponding investment in the program offices. In addition to addressing many of the key science objectives of the current decadal survey, increased support will help to ensure that NASA enters the following decade in a strong position in terms of technology and expertise.
As the capabilities of the Suborbital program increase, the sophistication of the proposed instruments also increases. For example, groups are now proposing coronagraphs from balloon platforms. This bold initiative could provide unique measurements while obtaining invaluable experience with the technology. New initiatives in the ballooning and sounding-rocket programs also have the potential to increase greatly the amount of time available to payloads. The ultralong-duration balloon (ULDB) program will provide 100-day, mid-latitude flights. With sufficient support, this relatively mature program could be returning science early in this decade.
Over the last decade, the balloon program has dominated the science return from the Suborbital program for astrophysics missions. It seems unlikely that this situation will change without a significant advance in capabilities, for example, multiorbit sounding rockets with vastly increased integration times. While still only a concept, this could transform the scientific and technological value of the sounding-rocket program. Implementing a program of multiorbit sounding rockets is estimated to cost $25 million for the first mission, with subsequent missions costing approximately $15 million each. Similarly, while the cost of ULDB missions will be incrementally larger than that of previous ballooning programs, the sophistication of their payloads will require funding levels beyond the program’s current budget. Given the scope of these initiatives, the panel recommends that they be funded outside the current program, perhaps competed within the Mission of Opportunity or Explorer lines. This will help to ensure that the low-cost management style of traditional suborbital missions is retained. In the meantime, the EOS Panel recommends that the return on science and technology development be taken as a significant
consideration when determining the balance of funding between the rocket and balloon programs in the next decade.
THE R&A PROGRAM: INCREASING SUPPORT FOR TECHNOLOGY DEVELOPMENT AND THE SUBORBITAL PROGRAM
NASA’s R&A program funds a wide variety of essential activities. A $70 million investment in 2008 supported the processing and archiving of data from NASA missions and provided research grants to guest observers to produce science. Of a further $68 million, half was allocated to development programs, including research-grant support for theory and fundamental physics, data analysis tools, and laboratory astrophysics, and half to technology development and the Suborbital program. In this section the panel focuses on the vital contribution of these latter activities and provides a rationale for their increased funding, a step that will, in fact, raise the level of support for all R&A activities.
Three Types of Technology Development Activities
NASA has built its successes of the last five decades on a foundation of new technology. Continued progress depends on fully developing the technology for high-priority missions, supporting development for future missions, and providing for new enabling technologies for yet-to-be-conceived missions.
Missions selected for development support their individual technology needs from the mission line. However, prior to starting formal development, a thorough understanding of the technology is necessary for accurate budgeting and scheduling. For this reason, the viability of the high-priority missions recommended in this panel report depends on investments made early in the decade for specific and targeted technologies to bring them to appropriate technology readiness levels (TRLs).11 Examples of the technologies that could be developed under this program are described below.
The missions recommended by the panel will not address some key science goals described in the reports of the Astro2010 Science Frontiers Panels; therefore, some specific areas of technology effort are necessary to develop future missions that will. The longer time-horizon argues against highly targeted development—a more general approach is better at achieving the required advances. Specific examples, such as detector and optics development, are called out in the sections below.
Finally, imaginative and innovative approaches to solving problems have always prepared the way for significant leaps in capability. Support of novel ideas is
crucial to the future vitality of NASA, even when some approaches are not directly applicable to planned missions. Indeed, it would have been impossible to predict a decade ago today’s advances in areas such as detector technologies and optics.
Technology Development for Recommended Missions
Most of the high-priority missions for the next decade already have well-developed programs and technology (Figure 6.17). For example, the WFIRST mission has direct technology heritage from the WFC-3 on Hubble and NIRCam on JWST. While a substantial number of detectors is required, the HgCdTe near-IR detectors for JWST meet all the current requirements for WFIRST.
As described above, IXO has been in development in one form or another for more than a decade. Techniques for building efficient large-area mirrors, X-ray microcalorimeters, and gratings are advancing at a pace to allow IXO to enter Phase B by mid-decade.
A micro-arcsecond astrometry mission is considered a strong candidate for the next exoplanet mission. At present, SIM Lite is the most thoroughly studied mission, with more than 13 years of design studies. It is ready to enter Phase C and would therefore require minimal technology development. In the EOS Panel’s recommended program, however, an exoplanet mission cannot start until late in the decade. In the interim, NASA should ensure that the sophisticated technologies already developed in the SIM program are not lost. By later in the decade—when an exoplanet mission might be selected—SIM Lite may not be the best way forward, and NASA should continue to respond to new technologies in this area to provide a broad range of options when the time comes.
A new UV-optical observatory requires significant progress in detector manufacturing and testing to build large focal plane arrays. These types of detectors might also benefit from suborbital testing. Along the same lines, the BLISS-like instrument for SPICA requires early investment in bolometer detectors, readout electronics, and cryogenics in 2010-2012 to maintain the JAXA/ESA schedule. It is also important to note that such development could feed directly into a U.S.-led far-IR mission in the 2021-2030 timeframe.
Technology Development in Support of Future High-Priority Science
Increased investment in technologies for missions with possible starts late in the coming decade serves several critical needs for NASA. While WFIRST, IXO, and BLISS for SPICA are well along their technology development paths, there remain key technologies for exoplanet and other possible future missions that will require substantially more work. Attention must also be paid to maintaining expertise: insufficiency of resources risks the loss of significant advances already made—many of which would be irretrievable. Finally, testing new technologies on ground-based and suborbital platforms can raise TRLs while returning high-quality science and supporting the field as a whole.
Technology for the millimeter to far-IR bands has advanced considerably, fueled early in the decade by work on missions such as Planck and Herschel—and more recently by ground-based and suborbital CMB experiments—and by facility instruments such as SCUBA-2 on the JCMT. However, to prepare for future missions like the proposed CMBPol, CALISTO/SAFIRE, and SPIRIT, several key areas need to be addressed. These include large multiplexed arrays of detectors and their associated electronics, cryogenic coolers, and optical designs that address specific systematic effects (Figure 6.18). This field, in particular, benefits substantially from suborbital testing of technology.
A number of future planet-finding missions will require significant investments in technology in the coming decade to determine feasibility, risk, and cost. Missions for direct detection of planets in visible light require suppression of starlight to better than 1 part in 10^10. Although this level of suppression is deemed a tractable technological problem, significant investments in traditional and non-traditional coronagraphs, coatings, and deformable mirrors will have to be made.
Technology for the Future
The directed program outlined above is meant to close technology gaps in order to move ahead with high-priority missions in key science areas. However, new ideas have the potential to revolutionize the field, and so it is important to maintain the current level of support in the R&A program for advanced technologies, particularly ones with the potential for paradigm shifts. Which programs to support will, as always, be determined through the peer-review process. Potential areas of advancement include new mirror and optical technologies, star shades, and new types of nulling interferometry (Figure 6.19).
To pick one example, external star shades may allow enhanced planet searches with telescopes that are also well suited for ultraviolet observations, thus serving two high-priority science programs and a wide array of science investigations. However, to make star shades feasible it may be necessary to invent new approaches to manufacturing, testing, deployment, and formation flying. These initial explorations are appropriate under the current R&A approach, where new ideas are funded based solely on peer review and not on specific development goals. If these issues can be resolved and star shades can approach TRL 3 or 4, a program focused on their application in future missions would become a high priority. This example
illustrates both the importance of an open, peer-reviewed technology program and the necessity of a mechanism to transfer a successful paradigm-shifting technology development to the more focused efforts needed for applications in future missions.
An advanced UV-optical telescope would address many of the key questions identified by the Astro2010 Science Frontiers Panels. However, achieving a viable program by the end of the decade will require advances in detector technology, wave-front sensing and control, and UV-optical mirror coatings. Most importantly, new approaches to very lightweight yet highly accurate monolithic or segmented mirrors may be needed.
Increased Funding for Technology Development and the Suborbital Program
Funding levels largely determine whether required technologies are developed successfully—closing the technology-readiness level gap has always been challenging. Seed money has in the past come from the R&A program. However, the program budget has dropped from $79 million in 2004 to a projected $65 million in 2010 (real-year dollars, based on an annual inflation index of 1.03). Because the R&A program supports a wide variety of research, including technology development, the Suborbital program, laboratory astrophysics, and theory, all of this research has been adversely affected by declining funding levels. While the discussion here focuses on the technology and suborbital programs, all areas will benefit from the recommended budget increases.
The panel believes that the R&A program has been strained beyond its ability to meet the needs of NASA: the panel thus recommends a significant increase to correct for declining funding in the 2001-2010 decade. Specifically, $20 million per year inside the current budget should be targeted to technologies that address high-priority science questions not covered by the EOS Panel’s recommendations for new starts in this decade. Based on the documents submitted to the panel by the community, an additional $20 million per year should be distributed more broadly to foster new ideas, with the aim of bringing them to TRL 3 to 5. The panel’s evaluation of the needs for suborbital efforts, based again on documents submitted to the panel, shows that at least an additional $15 million per year is required. For the Suborbital program, the expectation is that $15 million per year in new funding combined with part of the $20 million in recommended new-technology funding will greatly enhance the number of missions. In recognition of the dual role of many suborbital missions, those that have a significant technology component should have access to both lines of resources through a single proposal. There are also some ideas for new, more costly suborbital initiatives (ULDB and sounding-rockets-to-orbit) that would need funding outside the traditional Suborbital program, possibly through the Mission of Opportunity or Explorer lines. In summary, the panel recommends an annual augmentation of $35 million to
the existing R&A program: $20 million for new technologies, and $15 million for demonstration of technologies and science programs in the Suborbital program.
FUNDING A BALANCED PROGRAM
Projected NASA budgets provided to the panel show little or no funding for new initiatives before 2014. Thereafter, the completion of JWST leads to substantial, increasing resources in excess of the “base budget,” which includes mission operations, R&A, Missions of Opportunity, and program management. The integral of the funds available for new initiatives from 2014 through 2020 could be as large as $4.0 billion (real-year dollars) with present projections. However, although this figure includes operating funds for JWST and SOFIA, it has no allowance for possible cost growth to complete them, and it does not include continued operations of existing missions such as HST and Chandra after 2015. The panel was also shown a projected budget that included assumptions about these possible added expenditures and that arrived at a much lower figure of $2.3 billion for new initiatives in this decade. Converting these figures to FY2009 dollars would reduce the money available for new initiatives by a further ~15 percent, unless budgets are adjusted for inflation.
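The ~15 percent figure follows from simple compounding; a minimal sketch, assuming the 3 percent annual inflation index cited elsewhere in this report also applies to these projections:

```python
# Deflating real-year dollars to FY2009 dollars at an assumed 3 percent
# annual inflation rate (illustrative; not an official NASA deflator).

INFLATION_INDEX = 1.03

def fy2009_value(real_year_dollars: float, fiscal_year: int) -> float:
    """Deflate real-year dollars spent in `fiscal_year` to FY2009 dollars."""
    return real_year_dollars / INFLATION_INDEX ** (fiscal_year - 2009)

# A real-year dollar spent in FY2014, when new-initiative funding first
# becomes available, is worth about 0.86 FY2009 dollars -- roughly the
# ~15 percent reduction in purchasing power quoted above.
deflator_2014 = fy2009_value(1.0, 2014)  # ~0.86
```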
In this final section, the panel attempts to describe the execution of the panel’s recommended program bracketed by these estimates of available funds for new initiatives in the 2010-2020 decade. In Table 6.3 the panel shows funding levels and priorities for the recommended program. The panel takes as a nominal budget the middle ground, the center (green) allocations, $3.1 billion FY2009 dollars, which it considers the minimum balanced program. Sufficient funds are identified to build WFIRST, to augment the Explorer program for astrophysics by $500 million over the period, to start IXO,12 and to fund a small exoplanet mission or start a larger one. The Explorer program, in fact, is part of the base budget, but the panel includes the augmentation it is recommending in the “new initiatives” category because the amount it recommends is too large to be accommodated by changing priorities within the base budget. When combined with the present level of funding for astrophysics Explorers, this is a sufficient allocation to raise the launch rate to one per year after 2015. The R&A augmentation (shown in green), which encompasses the additional investments discussed in this section, is intended to be phased in by rearranging priorities of the base budget and as such is not counted here as a new initiative. The U.S. contribution of a BLISS-like spectrometer to SPICA is also assumed to be within the base budget (a NASA solicitation for science-investigation concept studies for SPICA has already been issued); approximately half of the estimated $200 million cost of SPICA participation is projected to occur before 2014.
TABLE 6.3 Funding the EOS Panel’s Recommended Program at Different Levels
An enhanced budget (blue) would increase the amount spent on IXO through 2020 to $1.0 billion, and allocate $700 million for an exoplanet mission. The exoplanet mission could be a SIM Lite start, a fully funded probe-class mission, or an investment in a more ambitious mission for the following decade. As described above, this choice would be made later in the decade by selective competition between SIM Lite and one of the alternatives discussed in the section “Missions to Search for and Study Exoplanets.”
In Table 6.3, the budget below “nominal”—in yellow—would provide a bare start for IXO, but no funding for an exoplanet mission. In this scenario, it is crucial to maintain a healthy Explorer program so that some diverse science goals can still be pursued, albeit at a much more modest level. However, the panel does not consider this level of funding sufficient for a balanced program.
In any scenario, the panel gives highest priority to building WFIRST as expeditiously as possible, with the goal of launching it within the decade, and then to augmenting the funding for astrophysics Explorers as rapidly as possible. Any additional funding should next go to IXO, and then to the exoplanet mission—consistent with the priorities of this panel’s report. Even in the best case, building IXO will require substantial further investment in the 2021-2030 decade. In the minimal budget case, funding is only sufficient to ensure that IXO would have enough support for technology development to qualify for a new start, a recommendation that would likely defer IXO to the 2020 decadal survey.