inferred. Early in the 1900s, over-the-horizon radio transmissions proved the ionosphere's existence. The alignment of auroral forms with Earth's magnetic field and the correlation of auroral activity with solar activity, deduced in the late 19th century, were the first real indications of the existence of a magnetosphere and its control, in some unknown manner, by the Sun. In 1930 Sydney Chapman and V.C.A. Ferraro developed the first real model of the magnetosphere. They postulated that a plasma (what we now call the solar wind) was emitted intermittently from the Sun, compressing Earth's magnetic field and carving out a magnetic cavity in the flowing plasma (which we now call the magnetosphere). By the mid-1950s the plasma density of the magnetosphere had been probed remotely using lightning-generated, very-low-frequency waves. Nevertheless, the beginning of space physics as a discipline was marked by the discovery of the Van Allen radiation belts and the in situ exploration of Earth's magnetosphere and its interaction with the solar wind, made possible by the advent of rockets and satellites in the late 1950s and 1960s.

The data gathered in space-based observations are collected by sets of 3 to perhaps 10 different sensors positioned on Earth-orbiting satellites or interplanetary probes. A typical satellite might carry a magnetometer to detect slowly changing magnetic fields and a separate sensor to measure magnetic oscillations; a device to record electric fields and waves; plasma analyzers (often a coordinated set of sensors) to measure the fluxes of charged particles as functions of their mass, energy, and direction of motion; and one or more sensors to measure high-energy charged particles. The data consist of time sequences of the sensors' outputs. When possible, similar or complementary data are collected by other satellites in different locations in space.

In addition, data gathered simultaneously from ground-based instruments are used to examine phenomena such as disturbances in the geomagnetic field (as detected by arrays of ground-based magnetometers), changes in the ionospheric density (as indicated by radar, riometer, and rocket-based observations), enhancements in atmospheric emissions signifying excitation by energetic particles (as shown in images from photometers and all-sky cameras), and activity on the Sun (as shown by, e.g., coronagraphs). Data on particles and fields are now often augmented with images of Earth's atmosphere in the visible, ultraviolet, or x-ray regions of the electromagnetic spectrum, thus enabling space physicists to relate observed high-altitude phenomena to ground-based observations.

The data matrices produced by space- and ground-based instruments thus include many different kinds of measurements, often taken at widely separated sites and with good (but differing) time resolution. A challenge in analyzing and interpreting the data is to combine and compare them so as to deduce a global picture of the behavior of the magnetospheric system. Combining and analyzing these various data sets and types requires extensive electronic communications, including electronic mail and network transfer of data and text files, among the home institutions, sometimes in different countries, of the many investigators involved.
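The core difficulty of combining measurements taken at differing time resolutions can be illustrated with a minimal sketch. The instrument names, cadences, and values below are invented for illustration; the technique shown, interpolating a coarsely sampled series onto the time base of a finer one so that the two can be compared point by point, is one common first step, not a method prescribed by this report.

```python
def linear_interp(t, ts, vs):
    """Piecewise-linear interpolation of the samples (ts, vs) at time t.

    Assumes ts is sorted in ascending order; values outside the sampled
    interval are held at the nearest endpoint.
    """
    if t <= ts[0]:
        return vs[0]
    if t >= ts[-1]:
        return vs[-1]
    for i in range(1, len(ts)):
        if t <= ts[i]:
            frac = (t - ts[i - 1]) / (ts[i] - ts[i - 1])
            return vs[i - 1] + frac * (vs[i] - vs[i - 1])

# Hypothetical cadences: a 1-second magnetometer time base and a
# 4-second plasma-analyzer density series (synthetic values).
t_mag = [float(i) for i in range(17)]        # 0..16 s, 1-s cadence
t_plasma = [0.0, 4.0, 8.0, 12.0, 16.0]       # 4-s cadence
density = [5.0, 5.4, 5.8, 6.2, 6.6]          # density samples (cm^-3)

# Resample the coarse density series onto the fine magnetometer base.
density_on_mag_base = [linear_interp(t, t_plasma, density) for t in t_mag]
```

After resampling, both series share one time base and a field perturbation can be correlated directly with the density at each timestamp; in practice, gap handling and instrument timing uncertainties complicate this step considerably.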

Methods and Technologies for Data Analysis

Space plasma physicists use a combination of analytical and numerical methods to interpret and understand data, as well as to assist in the planning of observational campaigns. These methods can range from the use of simple linearized equations to model the early time evolution of plasma waves to the use of extensive multidimensional codes that attempt to simulate the full complexity of these nonlinear systems. The simulations can consist of calculations of the locations and motions of many millions of individual charged particles, together with the electric and magnetic fields that they generate, or they can be solutions of simultaneous partial differential equations that describe the plasma as a "fluid." Running a typical simulation requires the use of a supercomputer, although some of the newer workstations can now run some of the smaller codes. The amount of numerical "data" generated is so great that if one were to attempt to keep it all, it would far exceed the capacity of even the largest computers. Even after pruning, the files that are kept necessitate good (reliable and fast) network communications between the
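The particle-tracking approach described above can be conveyed by a toy sketch, not by any production code: a few charged particles are advanced through a prescribed one-dimensional electric field with a leapfrog integrator. All constants and the field profile are invented for the illustration; real particle codes track millions of particles and solve self-consistently for the fields the particles themselves generate.

```python
# Toy particle pusher (illustrative only): advance charged particles in a
# prescribed 1-D electric field E(x) = -K*x using a leapfrog scheme.
# Real particle-in-cell codes compute E from the particles' own charge
# density at every step; here the field is simply imposed.

K = 1.0    # field gradient (arbitrary units)
QM = 1.0   # charge-to-mass ratio (arbitrary units)
DT = 0.01  # time step

def efield(x):
    """Prescribed restoring electric field."""
    return -K * x

def push(positions, velocities, steps):
    """Leapfrog update: half-step velocity kick, full drift, half-step kick."""
    x, v = list(positions), list(velocities)
    for _ in range(steps):
        v = [vi + 0.5 * DT * QM * efield(xi) for xi, vi in zip(x, v)]
        x = [xi + DT * vi for xi, vi in zip(x, v)]
        v = [vi + 0.5 * DT * QM * efield(xi) for xi, vi in zip(x, v)]
    return x, v

# Three particles released from rest oscillate in the restoring field.
x, v = push([1.0, 0.5, -1.0], [0.0, 0.0, 0.0], steps=1000)
```

Even this toy version hints at the storage problem the passage describes: saving the positions and velocities of every particle at every one of the thousands of time steps multiplies the output by orders of magnitude, which is why production runs keep only pruned snapshots and summary fields.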

Copyright © National Academy of Sciences. All rights reserved.