As a major new scientific instrument, an electron-ion collider (EIC) would, through its design, construction, and operation, offer significant benefits to other fields of science and to society. This chapter summarizes these contributions and benefits.
Chapter 4 described two concepts that have been developed to realize an EIC accelerator: one based on the existing Relativistic Heavy Ion Collider (RHIC) complex at Brookhaven National Laboratory (BNL), called eRHIC; and a second based on the existing Continuous Electron Beam Accelerator Facility (CEBAF) accelerator at the Thomas Jefferson National Accelerator Facility (JLab), called the Jefferson Laboratory Electron Ion Collider (JLEIC). Ring-ring concepts have been developed for both eRHIC and JLEIC. An advanced linac-ring concept using an energy-recovery linac (ERL) has also been developed for eRHIC, although the ring-ring option is now preferred by BNL. All EIC concepts are technically challenging and motivate a significant research and development (R&D) effort in the United States. This effort addresses fundamental issues in accelerator physics that are of broad interest beyond the nuclear physics community. Several examples of EIC R&D are highlighted here.
Applications of ERLs
Compact and cost-effective ERLs can be used as drivers for high-power free-electron lasers (FELs), which are photon sources with applications in many fields of scientific research and industry. A number of ERLs have already been constructed around the world, and more are planned at several laboratories.1 The ERLs required for the EIC are among the most demanding designs under consideration.
Strong Hadron Beam Cooling
High-energy bunched-beam cooling was a spectacularly cost-effective upgrade to the luminosity performance of RHIC. However, the stochastic cooling technique used at RHIC is not powerful enough for the requirements of future hadron colliders such as an EIC. The novel concept of coherent electron cooling (CeC), developed by scientists at BNL and JLab, holds the promise of a very high-bandwidth, fast method for cooling hadron beams and achieving high luminosity. If the proof-of-principle experiment at BNL eventually leads to successful implementation, it would potentially be of very high interest for other future hadron colliders in a range of energies, perhaps up to that of the Large Hadron Collider (LHC).2 Even if no such colliders are envisaged beyond an EIC itself, establishing this principle and rendering it operational would be a tour de force of advanced accelerator technology, which would surely have multiple indirect benefits.
2 Coherent electron cooling is not obviously useful at higher-energy colliders like FCC-hh, discussed in Chapter 5, where the beam energies are so high that natural synchrotron radiation damping already provides sufficient cooling.
Superconducting RF Technology
Particle acceleration by means of superconducting radio frequency (SRF) cavities is an established technology with over 30 years of application. Early deployments included the recyclotron at the Hansen Experimental Physics Laboratory (HEPL), the microtrons at Illinois, in the United States, and Darmstadt, Germany, and the Cornell Electron Storage Ring, as well as large systems in the electron-positron colliders TRISTAN (at the High Energy Accelerator Research Organization [KEK], Japan) and LEP (at the European Organization for Nuclear Research [CERN]), which allowed high energies to be reached without prohibitive power consumption. In the United States, the CEBAF accelerator pioneered the application of SRF technology on a large scale, and JLab is a national center of expertise for SRF. The technology continues to evolve to meet the challenges of very-high-current beams through R&D for machines such as B-factories, light sources, and EICs. Considering the acceleration systems required for the eRHIC storage ring and the injector ERL, there are clear synergies and common developments in the areas of multicell superconducting cavities and high-power couplers.
Besides the main accelerating cavities, the EICs require crab cavities installed close to their interaction points to locally rotate both beams and enhance the luminosity when they collide. Here there is a very strong synergy with the developments under way at BNL of similar devices for installation in the High-Luminosity Large Hadron Collider (HL-LHC). Crab cavities were operated successfully in a hadron accelerator, the SPS at CERN, for the first time only very recently.
Electron Cloud Mitigation
Electron clouds are a major challenge for all accelerators with high-intensity, positively charged beams. Operating accelerators (e.g., RHIC, LHC, and SuperKEKB) devote significant time to “scrubbing” the vacuum chamber to reduce secondary electron yield, and well-established collaborations are performing R&D to mitigate this problem. To achieve a further factor of 2 in proton beam intensity beyond RHIC, in situ coating techniques are under development for eRHIC at BNL. This technology would apply a low-impedance, 10 μm copper layer to the present stainless-steel RHIC beam pipe, which would then be further coated with carbon to reduce the secondary electron yield. Related developments for the HL-LHC are under way at the Cold Bore Experiment (COLDEX) facility in the CERN SPS. Progress in this effort will benefit future facilities with intense hadron beams in cold-bore beam pipes and, possibly, those with intense positron beams.
The interaction regions of the present EIC design concepts have to accommodate and strongly focus incoming and outgoing beams of very different energies; allow the installation of crab cavities, spin rotators, and other elements; and minimize the exposure of the detector to synchrotron radiation. These requirements result in complex and highly constrained geometries and optics, which require special magnets. In the EIC concepts, the fields of the large-aperture, high-gradient superconducting quadrupoles that focus the hadron beams have to fall away sharply in the transverse direction to avoid disturbing the electron beam. The design solution has the electron beam passing through special “hoses” in the return yokes of these quadrupoles. These magnets have clear synergies with those proposed for the Large Hadron-Electron Collider (LHeC) and Future Circular Collider (FCC-eh), whose designs face similar problems at still higher energies. The electron beam’s own focusing has to be provided within actively shielded quadrupoles similar to those proposed for the International Linear Collider (ILC). Bending of that beam will likely require actively shielded super-ferric dipoles.
More generally, the development of the numerous special magnets required for an EIC will build on and sustain the world-leading capabilities of the Magnet Division at BNL, an important resource for accelerator developments in the United States. Magnets using Nb3Sn superconductors are expected to find many applications and are already being deployed, in particular, in the HL-LHC.
High-Current Polarized Electron Source
R&D is in progress at BNL and the Massachusetts Institute of Technology on the development of a high-current (50 mA), polarized (80 percent) electron gun that would be needed by the ERL-based design of eRHIC. Although reaching a goal so far beyond the present state of the art was identified as the major technical risk motivating the switch to the storage ring design, the outcome of this R&D could be of importance for future linear collider projects and the existing CEBAF facility. Were it to proceed rapidly enough to permit a switch to the alternative ERL design, it could reduce the costs of construction and operation of the EIC.
A highly qualified workforce trained in nuclear science is vital to the nation’s health, economy, and security. Nuclear science is especially relevant to confronting some of the most pressing issues facing humanity. Nuclear weapons control, carbon-free energy production on a large scale, counter-terrorism, and nuclear medicine are all areas where nuclear physicists play a leadership role. Furthermore, nuclear physicists serve in governments worldwide in leadership positions that address these critical issues. Figure 6.1 shows the distribution of careers of nuclear science Ph.D. recipients from 2006 to 2009.
A landmark study of education in U.S. nuclear science in 2004 recommended an increase of about 20 percent in 5 years in the production of Ph.D.’s in the field. This was significantly motivated by critical needs for nuclear expertise in the area of national security. The most recent assessments report that, at best, U.S. Ph.D. production has been flat. Increasingly, individuals who receive their Ph.D. outside the United States fill positions for young nuclear physicists. Furthermore, the most recent assessments specifically identify workforce challenges in the areas of accelerator science and high performance computing. An EIC can play a very valuable role in sustaining the U.S. nuclear physics workforce for the coming decades.
World-leading discovery science in the United States requires that the nation’s accelerator-based national user facilities have world-leading capabilities to answer the important open questions in nuclear and high energy physics; in the materials, biological, and chemical sciences; and in applications of these fundamental fields. Next-generation accelerators in the United States such as an EIC will be more challenging to build and to operate safely and cost-effectively than earlier accelerators and detector systems. Highly reliable, small accelerators are also central to advances in medicine and industry, constituting a multibillion-dollar enterprise with over 30,000 particle accelerators in the world.3 Many of the most exciting innovations in medicine and commerce arise from small commercial companies located adjacent to accelerator laboratories and universities.
A sustainable supply of highly skilled scientists and engineers is required to meet the challenges of developing accelerators for fundamental science and applied research and development. Because the most critical areas of relevant technical expertise are rarely taught in U.S. universities, the Department of Energy (DOE) Office of Science and the National Science Foundation (NSF) must provide workforce development opportunities to ensure rigorous, structured training for graduate students, postdoctoral scholars, and the staff already at the national laboratories.4
3 U.S. Department of Energy, Accelerators for America’s Future, Office of Science, Washington, D.C., 2010.
4 Assessment of Workforce Development Needs in the Office of Nuclear Physics Research Disciplines, Report to NSAC from the Subcommittee on Workforce Development, J. Cizewski (Chair), July 18, 2014.
The U.S. accelerator physics community numbers about 1,100,5 including staff at the accelerator laboratories and in private industry, and about 35 tenured or tenure-track faculty, along with staff and students, at about 15 universities. The scarcity of formal Ph.D. programs and the lack of advanced graduate-level courses in accelerator science and technology in U.S. universities are directly addressed by the U.S. Particle Accelerator School (USPAS), an effective partnership of major research universities and DOE laboratories. The USPAS plays a particularly important role in educating and training young scientists by providing high-quality courses on essential topics in the physics and technology of beams, delivered by world experts at locations around the country on a semiannual basis. Although USPAS courses are typically not held in a traditional campus setting, the training modules are academically rigorous and carry direct university graduate credit from the host university for each session.
In nuclear physics, the user facilities at BNL, JLab, and Michigan State University (MSU) employ the vast majority of accelerator physics scientists and engineers, in support of operations at RHIC and CEBAF and of the construction of the Facility for Rare Isotope Beams (FRIB) at MSU. All three of these institutions have active programs in accelerator physics, with support coming from DOE/Nuclear Physics, DOE/High Energy Physics, and NSF. The BNL and JLab programs are both supporting relevant R&D for the EIC (and the development of highly trained Ph.D.’s in accelerator science) through associated university accelerator physics programs: the Center for Accelerator Physics at Old Dominion University (ODU) and the Center for Accelerator Science and Education run jointly by BNL and Stony Brook University. These efforts are important and should receive continued support as preparations are made for the construction of an EIC. The 2014 Nuclear Science Advisory Committee (NSAC) subcommittee report on workforce needs in nuclear physics identified6 significant challenges in attracting and developing a talented U.S. workforce in accelerator science and associated technologies. It recommended an expansion of USPAS courses.
5 Official Unit membership of the American Physical Society, 2017.
6 Assessment of Workforce Development Needs in the Office of Nuclear Physics Research Disciplines, Report to NSAC from the Subcommittee on Workforce Development, J. Cizewski (Chair), July 18, 2014.
Essential to the vitality of the U.S. accelerator community is R&D on cutting-edge technical challenges. This research both engages the best talent and attracts the brightest young minds to the field. In this regard, the high-priority accelerator R&D identified for an EIC—for example, crab cavity operation in a hadron ring, development of ERLs, strong hadron cooling, magnet design, and polarized source development—demands an intensive and systematic focus by the U.S. particle accelerator community. Clearly, success will demand the participation of scientists and engineers across all of the nation’s accelerator laboratories for an extended period. Furthermore, it presents a highly leveraged opportunity to grow the small university research community. The design and realization of a high-luminosity, polarized EIC represents a singular opportunity for the U.S. accelerator community in that it will demand that core capabilities be kept at the cutting edge and hence position the United States for other future large-scale accelerator projects.
The goal of understanding how protons, neutrons, their interactions, and nuclei emerge from the strong interaction at a fundamental level calls for the combined strengths of accelerator science, large experimental collaborations and detectors, and theory, each of which is increasingly incorporating advanced scientific computing resources, techniques, and associated research. Nuclear physics, high energy physics, and computing have traditionally had strong synergies, driven by mutual interests in high-performance calculations and simulations, in vast data rates and volumes with commensurate analysis demands, and in advanced networking and data sharing. The experiments at the LHC are prime examples and have been characterized as a resounding success of bold extrapolations and numerous technological breakthroughs.7 The LHC experiments have collectively been at the forefront of beam-collision rates and event sizes from their start, and will continue to push these boundaries with high-intensity hadron beams well into the high-luminosity LHC era, beyond the scale of the envisioned EIC.
The continued rapid pace of technological development is starting to enable a transition from the event-oriented, triggered data acquisition of past and current experiments in nuclear and high energy physics to data models in which detector subsystems deliver time-stamped streams of data for real-time processing with increasingly integrated and advanced computing resources. The LHCb experiment, for example, is preparing for triggerless readout for LHC Run 3 (2021-2023, prior to an EIC) and will process a data rate of about 5 TB/s in real time on its online central processing unit farm. In view of the inherent advantages, as well as new opportunities, this trend is being pursued broadly in nuclear physics, including, for example, for experiments with the future Gamma-Ray Energy Tracking Array instrument to be deployed at FRIB.8
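A back-of-the-envelope calculation illustrates why triggerless readout forces real-time data reduction. The 5 TB/s figure is taken from the text; the daily storage budget below is purely an illustrative assumption, not an LHCb parameter.

```python
# Scale of a triggerless readout, using the ~5 TB/s LHCb Run 3 figure
# quoted in the text. The storage budget is an illustrative assumption.
RAW_RATE_TB_S = 5.0            # detector output, terabytes per second
SECONDS_PER_DAY = 86_400
STORAGE_BUDGET_TB_DAY = 500.0  # assumed affordable archival volume per day

raw_per_day_tb = RAW_RATE_TB_S * SECONDS_PER_DAY    # 432,000 TB = 432 PB/day
reduction_needed = raw_per_day_tb / STORAGE_BUDGET_TB_DAY

print(f"raw volume: {raw_per_day_tb:,.0f} TB/day")
print(f"required real-time reduction: ~{reduction_needed:,.0f}x")
```

Even with a generous storage budget, the raw stream must be reduced by roughly three orders of magnitude before it leaves the online farm, which is why the processing has to happen in real time.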
7 S. Cittolin, 2012, The data acquisition and reduction challenge at the Large Hadron Collider, Phil. Trans. R. Soc. A 370:954.
8 See, for example, U.S. Department of Energy, DOE Exascale Requirements Review—NP/ASCR, an Office of Science review sponsored jointly by Advanced Scientific Computing Research and Nuclear Physics, Washington, D.C.
Lepton-nucleon scattering experiments past and present provide further context for a future EIC. The completed Hadron-Electron Ring Accelerator (HERA) collider program is of particular relevance. Besides the scientific insights it has delivered and the role it has played, the HERA program also yielded benchmark data on cross sections and event topologies for projections and advanced simulations for an EIC, as well as invaluable experience with the accelerator, the experimental detectors, and their all-important integration.
EIC luminosities will exceed those achieved at HERA by two to three orders of magnitude. EIC science furthermore requires polarization, heavy-ion beams, a wide range of center-of-mass collision energies, and experimental capabilities to measure a broad range of interaction channels with numerous correlations and in multiple dimensions. Each of these aspects calls for new and detailed simulations to develop full-fledged, optimized designs for the facility as well as the experiments. An EIC will be among the first facilities to come online in the era of exascale computing, an era that will see unprecedented integration of computing in the collider and experiments. These developments, combined with continued advances in machine learning and other areas, will open up opportunities for truly new approaches to nuclear physics experiments and analyses at scale, perhaps removing altogether the current distinction between acquiring the data from the instruments and their subsequent analysis.
The exact theory of the strong interaction is thought to be that provided by quantum chromodynamics (QCD), which has the remarkable property of confinement, in which interacting colored quarks and gluons produce colorless nucleons and nuclei as composite bound states. The complex nature of this continuum theory—it does not lend itself to analytic approximation—rules out any direct solution. However, a discrete version of the theory, in which space and time coordinates become points on a four-dimensional finite lattice, can be solved given sufficient computing resources. Despite the many technical issues that must be addressed—including the choice of how QCD is adapted to the lattice, the consequences of the finite lattice spacing, and the extrapolation from finite computable volumes to infinite volume—lattice QCD (LQCD) can yield effectively exact results with known error bars in many applications.
Ken Wilson formulated QCD on a lattice in 1974, arguing that this discrete restriction of the theory could succeed. In the more than 40 years since his formulation, both LQCD algorithms and computing power have advanced by many orders of magnitude. Over the past 20 years, machine speeds have increased from the terascale—10^12 flops—to within an order of magnitude of the exascale—10^18 flops. Remarkably, algorithmic advances spurred by efforts to solve LQCD have contributed equally to this progress. Today, with “cold” LQCD techniques, hadron masses and certain weak interaction couplings can be calculated to a precision of 1 percent. The RHIC program motivated an entirely new thrust of the field—the exploration of the phases of QCD with “hot” LQCD methods.
The EIC comes at an interesting time: a new generation of supercomputers with novel architectures is being installed at Oak Ridge National Laboratory, Lawrence Livermore National Laboratory, and Argonne National Laboratory, and entirely new approaches to the solution of field theories are being considered, such as Hamiltonian methods adaptable to quantum computers. The challenging problems that the EIC will pose are guaranteed to continue to drive important advances in hardware and algorithms.
The EIC, with its luminosity and polarized beam capabilities, will allow us to look at nucleons and nuclei in much greater detail than is now possible, imaging the transverse momenta and positions of quarks and gluons in relativistic hadrons. EIC experiments will tell us how the nucleon’s spin is distributed among its constituents, including the role of sea quark and gluon orbital angular momentum. They will help us learn how gluons interact with each other, fusing and splitting. The committee expects that qualitatively new regimes will be found in which the gluons reach an asymptotic density and dominate the dynamics. The EIC will also determine how quark and gluon distributions are altered when nucleons are bound in nuclei. Such EIC measurements will provide a large set of new challenges for theory generally, and for LQCD specifically.
The parton distribution functions (PDFs) that will be measured at the EIC pose particular challenges for LQCD. The calculations that arise in LQCD formulations, which are evaluated using computer simulations, are carried out using a mathematical trick: rotation to “imaginary time.” This procedure forces the amplitudes associated with quantum states to decay exponentially, with excited-state components diminishing faster than the ground state. The use of imaginary time thus allows one to “filter out” excited states until only the ground state remains.
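The filtering mechanism can be written out explicitly (a schematic form; normalization and operator conventions vary). A Euclidean two-point function of an interpolating operator with the quantum numbers of the target state decomposes as

```latex
C(\tau) \;=\; \langle O(\tau)\, O^{\dagger}(0) \rangle
        \;=\; \sum_{n} \big|\langle n |\hat{O}| \Omega \rangle\big|^{2}\, e^{-E_{n}\tau}
        \;\xrightarrow{\ \tau\to\infty\ }\;
        \big|\langle 0 |\hat{O}| \Omega \rangle\big|^{2}\, e^{-E_{0}\tau},
```

so each excited state with energy $E_{n} > E_{0}$ is suppressed relative to the ground state by a factor $e^{-(E_{n}-E_{0})\tau}$, and at large Euclidean time only the ground-state contribution survives.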
However, this procedure fails for PDFs—which describe the longitudinal momentum structure of the nucleon—because these quantities are defined through a time-dependent correlation between quarks and gluons. The LQCD rotation to imaginary time rules out a direct calculation of such time-dependent correlations. Consequently, until recently, calculations were limited to evaluations of moments (certain integrals) of PDFs,9 which can be calculated in LQCD. Unfortunately, technical problems associated with the reduced symmetry of the LQCD lattice, relative to continuous space-time, have so far limited calculations to the first few such moments.10
9 G. Martinelli and C.T. Sachrajda, 1987, Pion structure functions from lattice QCD, Phys. Lett. B196:184; M. Göckeler et al., 2001, Lattice calculations of the nucleon’s spin-dependent structure function g2 reexamined, Phys. Rev. D63:074506, and 2005, Investigation of the second moment of the nucleon’s g1 and g2 structure functions in two-flavor lattice QCD, Phys. Rev. D72:054507; D. Dolgov et al., 2002, Moments of nucleon light cone quark distributions calculated in full lattice QCD, Phys. Rev. D66:034506; Ph. Hägler et al., 2008, Nucleon generalized parton distributions from full lattice QCD, Phys. Rev. D77:094502.
Recently, however, a promising new strategy11 for directly calculating the PDFs has been proposed. It involves carrying out the LQCD calculation of a modified PDF (called a “quasi-PDF”) and then relating it to the true PDF through a perturbative matching, using a tool of theoretical physics known as “effective field theory.”12 The feasibility of quasi-PDF calculations in LQCD has been demonstrated in prototype investigations.13 There are technical issues in the procedure that require further exploration—for example, determining whether the lattice extrapolations to infinite volume and vanishing lattice spacing are properly handled in this two-step procedure—but so far studies have supported the soundness of the approach.14
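Schematically (conventions differ between papers), the quasi-PDF is an equal-time, purely spatial correlation in a nucleon of large momentum $P_z$, which is computable in Euclidean LQCD, and is then matched to the light-cone PDF:

```latex
\tilde{q}(x, P_z) \;=\; \int \frac{dz}{4\pi}\, e^{\,i x P_z z}\,
   \langle P |\, \bar{\psi}(z)\, \gamma^{z}\, W(z,0)\, \psi(0)\, | P \rangle ,
\qquad
\tilde{q}(x, P_z) \;=\; \int_{-1}^{1} \frac{dy}{|y|}\,
   Z\!\left(\frac{x}{y}, \frac{\mu}{P_z}\right) q(y,\mu)
   \;+\; \mathcal{O}\!\left(\frac{\Lambda_{\mathrm{QCD}}^{2}}{P_z^{2}},\, \frac{M^{2}}{P_z^{2}}\right),
```

where $W(z,0)$ is a spatial Wilson line ensuring gauge invariance, $q(y,\mu)$ is the light-cone PDF at scale $\mu$, and the matching kernel $Z$ is computed perturbatively in large-momentum effective theory; the power corrections shrink as the nucleon momentum $P_z$ grows.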
Even with this new procedure, LQCD calculations of hadrons carrying large momenta will be computationally challenging. However, there is great optimism in the field that the theoretical quantities most relevant to the EIC program are now within reach of the technique. This means that LQCD could become the standard tool for interpreting EIC measurements and for guiding its future program. Anticipated algorithmic and hardware improvements over the next decade will help the field reach this goal.
10 W. Detmold, W. Melnitchouk, and A.W. Thomas, 2001, Parton distributions from lattice QCD, Eur. Phys. J. Direct 3:1; J.W. Negele, 2002, Understanding parton distributions from lattice QCD: Present limitations and future promise, Nucl. Phys. A711:281; Ph. Hägler et al., 2008, Nucleon generalized parton distributions from full lattice QCD, Phys. Rev. D77:094502; Z. Davoudi and M.J. Savage, 2012, Restoration of rotational symmetry in the continuum limit of lattice field theories, Phys. Rev. D86:054505.
11 X. Ji, 2013, Parton physics on a Euclidean lattice, Phys. Rev. Lett. 110:262002.
12 X. Xiong, X. Ji, and Y. Zhao, 2014, One-loop matching for parton distributions: Nonsinglet case, Phys. Rev. D90:014051; X. Ji, A. Schafer, X. Xiong, and J.-H. Zhang, 2015, One-loop matching for generalized parton distributions, Phys. Rev. D92:014039; X. Ji, 2014, Parton physics from large-momentum effective field theory, Sci. China Phys. Mech. Astron. 57:1407.
13 C. Alexandrou et al., 2015, Lattice calculation of parton distributions, Phys. Rev. D92:014502; J.-W. Chen, S.D. Cohen, X. Ji, H.-W. Lin, and J.-H. Zhang, 2016, Nucleon helicity and transversity parton distributions from lattice QCD, Nucl. Phys. B911:246.
14 R.A. Briceno, M.T. Hansen, and C.J. Monahan, 2017, The role of the Euclidean signature in lattice calculations of quasi-distributions and other non-local matrix elements, Phys. Rev. D96:014502.
Condensed matter physics is concerned with emergent behavior in many-body systems of atoms and electrons. Historically, nuclear physicists have studied the many-body properties of nuclear matter and finite nuclei. Many of these studies were strongly influenced by advances in condensed matter theory—for example, the discovery of pairing in nuclei and the development of Landau Fermi liquid theory.
Emergent phenomena described by these theories include superfluidity in neutron stars and nuclear collective motion. With the development of QCD, new types of many-body effects were discovered. An important example is chiral symmetry breaking, which is associated with the condensation of quark-antiquark pairs, and which can be understood in analogy with pair condensation or magnetization in metals and nuclei.
Indeed, at this point, with a deeper understanding of QCD, scientists are poised to view nucleons and nuclei as collective many-body systems in which quark and gluon interactions lead to new emergent phenomena. One of these is the nucleon itself: 99 percent of the energy of a nuclear system is carried simply by the masses of these “composite fermions,” bound states of quarks interacting through gluon exchange. QCD gives rise to completely new many-body phenomena, which are intimately tied to the fact that the gluon interacts with itself, unlike the photons that mediate electromagnetic interactions. One remarkable consequence of this nonlinearity is confinement, the absence of isolated color charges.
An EIC will refine understanding of confinement, but it will also study completely new types of many-body phenomena, those associated with saturation in dense gluonic matter. Saturated gluonic matter is a transient state which, in collisions of hadrons or nuclei, eventually decays into a quark-gluon plasma. This transition is a new, far-from-equilibrium, intrinsically quantum mechanical, and strongly coupled many-body phenomenon that promises to reveal effects that have not been seen in other systems to date. For example, it has been suggested that the decay of saturated gluonic matter into a quark-gluon plasma seeds the formation of topological defects, which create an asymmetry in the handedness of produced quark-antiquark pairs. This handedness manifests itself in heavy ion collisions through interesting transport phenomena known as “chiral magnetic effects.” Analogues of chiral magnetic effects have been discovered in condensed matter systems—for example, in the semimetal ZrTe5, where they may lead to novel spintronic devices. The initial formation of topological defects in heavy ion collisions is related to the structure of the color field in saturated gluonic matter. Unraveling the structure of these fields is a central goal of an EIC, as described in detail in Chapter 2. The rapid pace at which novel topological materials are being developed suggests that the interaction between QCD and condensed matter physics will continue to be fruitful.
HERA played an essential role in the development of QCD. The insights it has given into the gluonic content of the proton, for example, are integral to the physics program at the LHC. Where the LHC is now probing QCD at the energy frontier and challenging the limits of QCD calculations when the color interactions are weak, the EIC will provide essential connections to QCD in regimes that are inaccessible with such techniques. At the most basic level, the EIC will expand understanding of the gluonic content of the proton and extend it to nuclei, which is relevant to present and future high energy physics pursuits at colliders, with neutrinos, or in space. More broadly, the richness of QCD phenomena eludes explanation at present by means of first-principles calculations, and advances continue to require the interplay of experiment and theory. As much of the theoretical work to develop physics beyond the Standard Model centers on Yang-Mills theories, QCD plays a special role in nuclear and particle physics as the only Yang-Mills theory within the Standard Model that admits relativistic bound states.
One of the most interesting questions in astrophysics concerns the high-energy limit of our universe: What kinds of accelerators exist in nature, and what can be learned about astrophysical acceleration mechanisms by measuring the high-energy neutrinos, nucleons, and nuclei that reach Earth? In recent years, new kinds of astrophysical observatories have been constructed to answer such questions. The Pierre Auger Observatory,15 located in the Mendoza region of Argentina near the base of the Andes, was completed in 2008. The observatory detects the collisions of ultra-high-energy cosmic rays—energetic nucleons or nuclei—with atmospheric nuclei through the air showers that such collisions produce (Figure 6.2). The energy of the collision is dissipated in the atmosphere through the production of vast numbers of photons, electrons, and muons. As this particle shower travels from the upper atmosphere toward Earth, it causes the atmosphere to fluoresce. The ultraviolet light is recorded by the Pierre Auger Observatory’s array of fluorescence telescopes, which can detect showers originating from an area of the sky in excess of 1,000 square miles. In addition, the energetic secondary particles that reach Earth’s surface are directly detected in the observatory’s array of water Cherenkov detectors.
The Pierre Auger Observatory has recorded over 50,000 ultra-high-energy events with E > 5 × 10^18 eV, corresponding to center-of-mass energies of roughly 100 TeV for proton primaries.
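The correspondence between cosmic ray laboratory energy and collider-style center-of-mass energy follows from fixed-target kinematics, √s ≈ √(2 m_p c² E_lab) for a proton primary striking a nucleon at rest. The following is a sketch of that estimate (it neglects the target's binding and assumes a proton primary; nuclear primaries and targets modify the numbers):

```python
import math

M_P_EV = 0.938e9  # proton rest energy m_p c^2, in eV

def sqrt_s_ev(e_lab_ev: float) -> float:
    """Center-of-mass energy (eV) for an ultrarelativistic proton
    of lab energy e_lab_ev striking a nucleon at rest."""
    return math.sqrt(2.0 * M_P_EV * e_lab_ev)

# Thresholds quoted in the text for Pierre Auger events:
for e_lab in (5e18, 5e19):
    print(f"E_lab = {e_lab:.0e} eV  ->  sqrt(s) ≈ {sqrt_s_ev(e_lab)/1e12:.0f} TeV")
```

For E_lab = 5 × 10^18 eV this gives close to 100 TeV, roughly an order of magnitude above the LHC's center-of-mass energy.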
15 Pierre Auger Collaboration (A. Aab et al.), 2015, The Pierre Auger Cosmic Ray Observatory, Nucl. Instrum. Meth. A798:172.
More than 220 events have energies in excess of 5 × 10^19 eV. This energy is close to what is known as the Greisen-Zatsepin-Kuzmin cutoff—the energy above which cosmic ray protons and nuclei can no longer propagate long distances, owing to their interactions with the cosmic microwave background left over from the Big Bang. The highest-energy events are only slightly perturbed by their passage through the galactic magnetic field, and thus can be correlated with possible point sources.16 Investigators have observed a change from a proton-dominated composition at a few times 10^18 eV toward heavier nuclei as the energy increases. Moreover, taking advantage of their hybrid data, they found a ~30 percent excess of muons in extensive air showers with respect to shower simulations.17 More recently, they also reported large-scale anisotropies toward the nearby distribution of extragalactic matter.18
16 Pierre Auger Collaboration (A. Aab et al.), 2015, Searches for anisotropies in the arrival directions of the highest energy cosmic rays detected by the Pierre Auger Observatory, Astrophys. J. 804:15.
17 Pierre Auger Collaboration (A. Aab et al.), 2016, Testing hadronic interactions at ultrahigh energies with air showers measured by the Pierre Auger Observatory, Phys. Rev. Lett. 117:192001.
18 Pierre Auger Collaboration (A. Aab et al.), 2015, Large scale distribution of ultra high energy cosmic rays detected at the Pierre Auger Observatory with zenith angles up to 80°, Astrophys. J. 802:111.
An important goal of Pierre Auger and other high-energy cosmic ray studies is to understand the composition of the cosmic rays as a function of energy, as noted above. The composition must be deduced from a comparison of specific properties of the observed air showers, such as the mean depth of the shower maximum and its dispersion, relative to expectations based on air shower simulations.19 Key inputs to the latter are hadronic interaction models tuned to describe scattering data from accelerators such as the Hadron-Electron Ring Accelerator and the Large Hadron Collider, which are then used in extrapolations to the higher center-of-mass energies relevant to Pierre Auger. One of the specific difficulties in relating subtle changes in shower properties to evolving composition is that the hadronic cross sections may be changing in an unexpected way—for example, because of the onset of saturation—making such extrapolations unreliable. Constraints from EIC data could help reduce the uncertainties in cosmic ray composition analyses.20,21
19 Pierre Auger Collaboration (A. Aab et al.), 2014, Depth of maximum of air-shower profiles at the Pierre Auger Observatory. I. Measurements at energies above 10^17.8 eV, Phys. Rev. D90:122005.
20 L.A. Anchordoqui, A.M. Cooper-Sarkar, D. Hooper, and S. Sarkar, 2006, Probing low-x QCD with cosmic neutrinos at the Pierre Auger Observatory, Phys. Rev. D74:043008.
21 E.M. Henley and J. Jalilian-Marian, 2006, Ultrahigh energy neutrino-nucleon scattering and parton distributions at small x, Phys. Rev. D73:094004.
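The logic of inferring composition from the shower maximum can be illustrated with the textbook superposition model, in which a nucleus of mass number A and energy E behaves like A independent proton showers of energy E/A, so heavier primaries reach maximum higher in the atmosphere. This is a pedagogical sketch only; the parameter values below are illustrative assumptions, not Auger fits, and real analyses rely on full hadronic-interaction simulations.

```python
import math

# Illustrative generalized-Heitler parameters (assumptions, not Auger fits):
X0 = 700.0    # g/cm^2, proton mean X_max at the reference energy E_REF
D = 60.0      # g/cm^2 elongation rate per decade of energy
E_REF = 1e18  # eV

def xmax_proton(e_ev: float) -> float:
    """Mean atmospheric depth of shower maximum for a proton primary."""
    return X0 + D * math.log10(e_ev / E_REF)

def xmax_nucleus(e_ev: float, a: int) -> float:
    """Superposition model: nucleus A ~ A proton sub-showers of energy E/A."""
    return xmax_proton(e_ev / a)

e = 5e19  # eV, near the highest Auger energies
for name, a in [("proton", 1), ("iron", 56)]:
    print(f"{name:6s}: <X_max> ≈ {xmax_nucleus(e, a):.0f} g/cm^2")
# Iron showers peak ~ D*log10(56) ≈ 105 g/cm^2 shallower than proton showers,
# which is the lever arm composition analyses exploit.
```

The proton-iron separation depends directly on the assumed hadronic physics through the elongation rate D, which is why unexpected changes in the cross sections, such as the onset of saturation, would bias the inferred composition.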