1 Introduction

Spatial statistics is concerned with the study of spatially referenced data and associated statistical models and processes. It is therefore relevant to most areas of scientific and technological inquiry. In addition, there are many problems that occur in subjects that are not overtly spatial, for example in speech recognition or in the construction of expert systems, that can be given useful spatial interpretations or can benefit in some other way from research in spatial statistics. Indeed, the abundance of application areas has meant that the task of the panel in preparing this report has been not only stimulating but also difficult, in that a limited number of topics had to be chosen for detailed discussion. The title of the report clearly implies that the panel places considerable emphasis on the relationship of spatial statistics to digital image analysis. This emphasis reflects the recent surge of interest among mathematicians and statisticians in this exciting area, which is destined to play an increasingly important role, not only in science and technology but in everyday life. For example, sequences of satellite images of regions of the Earth are now collected routinely. Each individual image is concerned with only a small part of the Earth's surface and itself is subdivided into a rectangular array of picture elements or "pixels," typically 1024 x 1024. For every pixel several or many measurements are taken, each of which corresponds to a reflectance value in a particular range of the visible or near-visible electromagnetic spectrum. The eventual aim might be to convert this vast quantity of two-dimensional, multivariate data into a simple crop inventory that can be used, for instance, to estimate the total potential winter-wheat harvest of a country.
Satellite images are also used for other purposes, such as locating and monitoring the condition of rocket silos in foreign territories; here, there are analogies in computer vision, where object recognition is one of many
important tasks. Significantly, conceptually similar problems occur in (nuclear) magnetic resonance imaging (MRI) of the brain and of other human organs, where it is required to produce tissue classifications (e.g., into white matter, gray matter, spinal fluid, and tumor) from multispectral data.

Magnetic resonance imaging, mentioned above, represents just one of several different imaging modalities in nuclear medicine. Other examples include the CAT-scan, in which X-ray images taken from several different positions are combined to reconstruct views of cross-sections of the anatomy of a patient; positron emission tomography (PET) and single photon emission computed tomography (SPECT), which are used to measure perfusion (blood flow) and metabolic activity in specific organs; and ultrasonic imaging, for measuring reflective and refractive gradients, such as organ boundaries, within the body. Here, we briefly describe SPECT, a low-cost technique within the reach of most medical facilities, as opposed to PET, which requires an on-site cyclotron and is available only in roughly 100 hospitals worldwide. Of course, there is a price to pay: SPECT currently produces much cruder images. However, this inadequacy stems in part from poor use of underlying, well-understood physical principles, and it is here that mathematical and statistical modeling can play a fundamental role.

In SPECT, a patient is injected with a radiopharmaceutical that has been tagged with a radioactive isotope. The pharmaceutical is chosen for its propensity to concentrate in the organ of interest in a way that is related to the particular phenomenon under study. The aim is to map the concentration of the pharmaceutical throughout the target region, usually on a slice-by-slice basis; time may also be a factor, as when different phases of a heart cycle are being monitored. SPECT relies on the radioactive decay of the isotope, which causes photons to be emitted according to a Poisson process in space and time, with intensity at any particular location being proportional to the concentration of the pharmaceutical there. A bank of gamma cameras, usually in a 64 x 64 array, counts the photon emissions that reach it and, by repeating the procedure for typically 64 positions around the patient's body, data that correspond to 64 different projections are collected. Mathematical interest centers on how the 64 x 64 x 64 array of counts can be used to reconstruct an accurate estimate of the true intensity map, suitably discretized. Commercially available reconstruction methods are based on "filtered back projection" (FBP), a technique borrowed from transmission (e.g., X-ray) tomography. However, FBP is not appropriate to SPECT, because of the very low signal-to-noise ratio and the importance of non-uniform attenuation and depth-dependent scatter and blur. These
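The emission model just described can be made concrete in a toy one-dimensional analogue. Everything below (the projection matrix, array sizes, and iteration count) is invented for illustration; the sketch simulates Poisson counts whose means are a linear projection of a true intensity map, then applies the classical maximum-likelihood EM (MLEM) iteration for Poisson emission data, not any commercial reconstruction method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D analogue of SPECT: a true intensity profile lam_true (n bins)
# is observed through a known projection matrix A (m detectors x n bins),
# and the detector counts are Poisson with mean A @ lam_true.
n, m = 16, 24
lam_true = np.zeros(n)
lam_true[5:9] = 10.0                      # a "hot" region of tracer uptake
A = np.abs(rng.normal(size=(m, n)))       # stand-in for geometry/attenuation
A /= A.sum(axis=0, keepdims=True)         # normalize each bin's sensitivity
counts = rng.poisson(A @ lam_true)

# MLEM iteration for Poisson emission data:
#   lam <- (lam / A^T 1) * A^T (counts / (A @ lam))
lam = np.ones(n)
sens = A.sum(axis=0)                      # per-bin sensitivity
for _ in range(200):
    ratio = counts / np.maximum(A @ lam, 1e-12)
    lam = lam * (A.T @ ratio) / sens
```

A useful sanity check is that every MLEM sweep keeps the iterate nonnegative and preserves the total count, i.e., the sensitivity-weighted sum of `lam` equals the sum of `counts`. Real SPECT reconstruction must additionally model attenuation, scatter, and depth-dependent blur, which is where the statistical modeling discussed below enters.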
forces combine to produce unsatisfactory reconstructions. At first sight, it might appear sufficient to build a proper physical model, in which the data are independent observations from Poisson random variables with means determined by a particular transform of the true intensity map. Unfortunately, the inverse problem of inferring (a discretized version of) the true intensities is too ill-posed for this to provide a satisfactory solution. An additional regularization assumption must be made, which prevents the local behavior of the reconstruction from becoming too disjoint, yet does not impose undue smoothness on the image. The Bayesian solution to this dilemma is one of the topics tackled in chapter 2, but the basic idea is to specify a stochastic model for the true image that is at once globally flexible, yet locally constrained to produce severe discontinuities only when there is convincing evidence of their existence in the data.

Incidentally, a somewhat similar problem occurs in the epidemiology of rare, noncommunicable diseases, such as particular forms of cancer, when incidence rates are observed over a specific period of time in a large number of contiguous administrative regions and the objective is to estimate underlying differences in risk. In each region, the number of cases can be viewed as an observation from a Poisson distribution, with mean proportional both to the population and to the risk there. When the means are small, the observed rates are very noisy and provide a poor measure of risk, so that some form of smoothing is required to produce a more readily interpretable map. Note that this problem is simpler than SPECT in that it involves direct rather than indirect sensing and also the number of observations is much smaller. As a result, it is possible to implement computationally intensive methods of spatial statistical analysis that are not yet feasible for genuine images.
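For the disease-mapping problem, the simplest such smoothing can be sketched as shrinkage under a gamma prior on the regional risks. All of the numbers below (region count, expected counts, prior parameters) are invented for illustration; the point is only that the posterior-mean estimate pulls noisy small-area rates toward the prior mean.

```python
import numpy as np

rng = np.random.default_rng(1)

# Each region's case count y_i is Poisson(E_i * r_i), where E_i is the
# expected count implied by the regional population and r_i the unknown
# relative risk.  A gamma(a, b) prior on r_i gives the posterior mean
# (y_i + a) / (E_i + b), a weighted average of the raw rate y_i / E_i
# and the prior mean a / b.
regions = 30
E = rng.uniform(2.0, 50.0, size=regions)                  # expected counts
r_true = rng.gamma(shape=4.0, scale=0.25, size=regions)   # mean risk 1
y = rng.poisson(E * r_true)

a, b = 4.0, 4.0                   # illustrative prior with mean a / b = 1
raw = y / E                       # noisy observed rates
smoothed = (y + a) / (E + b)      # shrinkage (posterior-mean) estimates
```

Since (y + a) / (E + b) is a convex combination of y / E and a / b, each smoothed rate lies between the raw rate and the prior mean, with sparsely populated regions (small E) shrunk the hardest. The fully spatial versions discussed in chapter 2 instead use priors that borrow strength from neighboring regions.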
Such problems are therefore not only valuable in their own right but provide useful insights for the future.

However, attention should also be paid to the origins of spatial statistics, as well as to its present and future. Perhaps the best known and most accessible among early examples is that of Dr. John Snow, a medical practitioner, who traced the source of a cholera epidemic in central London in 1854 by plotting the locations of water pumps and of deaths from the disease on the same map (see Tufte, 1983, p. 24). Such simple graphical techniques are still very important, though it should be noted that they are often of little value in modern epidemiology because of variations in background population density. One could cite many other isolated examples, but it is probably fair to say that spatial statistics did not emerge as an identifiable discipline until 1960, with the publication by Bertil Matérn of his doctoral
dissertation entitled Spatial Variation. Much of the material was well ahead of its time, although, remarkably enough, some of it had been completed as early as 1947. The treatise has recently been republished (Matérn, 1986) and still provides much useful guidance, regarding both statistical theory and practice. However, it was not until the 1970s and 1980s that spatial statistics began to receive widespread attention from other mathematicians and statisticians. Nonetheless, it may be said to have "come of age," as it now provides a major focus of contemporary research. Applications are many and varied and generate a steady stream of new problems.

Historically, observational programs that use the analysis methods of spatial statistics (e.g., earth sciences, agriculture, and epidemiology) have been limited by sparse sampling. As recently as 20 years ago, for example, as few as 10 observations of sea surface temperature per day over a 200-km2 area of the ocean were considered state of the art. From a statistical point of view, the inadequacy of such sparse sampling in a domain of large spatial and temporal variation (such as in the ocean) was clear. Modern data acquisition methods (e.g., satellite observations from space) have now greatly circumvented this sampling limitation. Data rates as high as 10^6-10^7 bits per second are routinely achieved with this new technology. A similar situation exists in nuclear medicine. The overall result of this improvement in data acquisition is the development of data bases that provide a high spatial resolution and a synoptic realization of a given process under study. Such data bases are manageable, however, only because of contemporaneous advances in digital computer processing and mass storage/retrieval devices. Computer resources are also required to execute the large number of repetitive operations typically required in the application of a technique of spatial statistics or digital image analysis to a field of science.
Advances in workstation technology, data base management, data compression, and data archiving, coupled with the expansion of computer network topologies, now provide the necessary technical infrastructure for the development of joint university curricula in spatial statistics and digital image analysis, and for the cross-disciplinary application of their methods to a broad range of scientific, engineering, and medical problems.

The vast amounts of data collected by satellite, radar, and sonar measurements need to be organized and reduced in complexity. While statistics originally emphasized obtaining maximal information from minimal data, the challenge from these new data sources is to summarize eloquently and to increase understanding of enormous quantities of information. Pictures need to be sharpened, new summary measures need to be developed, and
different forms of storing, organizing, and retrieving information need to be implemented. The interface between statistics and computer science is particularly important here: data structures, such as geographic information systems (GIS), can help in both organization and display of spatially expressed data, and graphical tools that visually link overlaid data components are useful in detecting and exhibiting relationships. There are many open research problems in the area of visual data-analytic techniques. For example, how does one display the uncertainty connected with contour lines on a statistical map, and what is an effective way of displaying more than one spatially expressed variable?

The remainder of this volume consists of 10 scientific chapters. Chapter 2 describes the Bayesian/spatial statistics framework in image analysis and computer vision. Particular attention is paid to image reconstruction. Chapter 3 addresses the application of non-Bayesian digital image analysis methods to oceanography and atmospheric science. Examples of image segmentation (i.e., cloud detection in complex natural scenes), near-surface velocity computation from image sequences, and ice boundary detection in satellite data are given. Chapter 4 applies methods of spatial statistics to a broad range of environmental science issues: spatial variation in solar radiation, environmental impact design, and modeling of precipitation using space-time point processes. Chapter 5 provides a basis for geostatistical analysis of earth science data. The variogram and kriging are then exploited to study the flow of groundwater from a proposed nuclear waste site and the spatial distribution of acid rain over the eastern half of the United States. The uses of spatial statistics to analyze data from agricultural field experiments are explored in chapter 6.
The objective of such analyses is to compare the effectiveness of different treatments (e.g., fertilizers) on a particular crop variety or to make comparisons between different varieties of the same crop, but with a valid assessment of error. Chapter 7 examines the traditional use of point process methods in ecology and analyzes some of the weaknesses in this applications area. Spatial statistics as a signal processing tool for radar and sonar systems in the ocean is the topic of chapter 8, while chapter 9 uses spatial statistics to examine chemical kinetics of active chemical systems. Chapter 10 provides a statistical basis for the field of stereology and discusses statistical modeling of stereological data. Finally, chapter 11 discusses problems of speech recognition with the ultimate goal of enabling machines to emulate human speech.

Although this report attempts a broad overview of main areas of spatial statistics and digital image analysis, there are many areas that we have not
covered. For example, spatial aspects of epidemiology have been used extensively in attempting to relate disease incidence to potential causes, but this application is not addressed in this report. Likewise, in astronomy, Neyman and Scott (1958) initiated the use of clustered point processes (cf. chapter 7) to describe the distribution of galaxies and clusters of galaxies. Recent work on processing images of galaxies (Molina and Ripley, 1989) provides a substantial improvement over traditional (maximum entropy) methods in the area. Methods of digital image analysis have recently found application in population genetics to deal with complex pedigrees (Sheehan, 1989). Sampling techniques for spatial data, emphasizing systematic designs, are reviewed in Ripley (1981, Ch. 3).

At first glance, this diversity in the applications of spatial statistics and digital image analysis may mask some of the underlying concepts common to most of the applications. Historically, spatial statistics and digital image analysis have tried to extract the most information from limited data sets. Modern data acquisition systems (e.g., remote sensing of the Earth using satellites, nuclear medicine) now provide well-sampled spatial data. Hence, a relatively new role for spatial statistics and digital image analysis is to synthesize and reduce large volumes of data into manageable pieces of information. "Modeling," used in a most generic sense, is perhaps the most fundamental concept unifying the diverse applications base of modern spatial statistics. Models attempt to provide a coherent framework for the interpretation of complex data sets. Statistical models, which generally are noncausal in nature, draw conclusions about data sets without necessarily providing future predictive capability. Physical models, which generally include time dependence, attempt to provide a prognostic capability about a physical process based on available data sets.
It is likely that significant advances in science and engineering will be made by judiciously combining these two types of models. It is also important to note that some phenomena (and/or data sets) may not be amenable to modeling. In this case, spatial statistics attempts to develop the best representation of the data set from which the maximum statistically robust information can be extracted. Some of the methods found in these applications are (1) exploitation of local specification models (i.e., Markov random fields), (2) use of covariance estimation, and (3) the revolutionary role of the Gibbs sampler in Bayesian statistics.

Recent advances in computer science and computer technology have contributed significantly to the efficient and effective utilization of spatial statistics by other fields (e.g., engineering sciences). Present computer architectures, however, are still far from ideal for several other classes of important problems in spatial statistics. For example, although massively parallel computers are applicable to some problems in spatial statistics and digital image analysis, current designs are not particularly appropriate to the important tomographic reconstruction problem.

Modern advances in image display technology, visualization techniques (e.g., dithering), and the theory and implementation of compact, efficient data structures now allow scientists and engineers to store, retrieve, and display efficiently the large amounts of spatial data now being routinely recorded in such diverse fields as medicine, oceanography, and astronomy. Advances in spatial statistics will also be closely linked to these advances in computer science, especially within the subfield of data structures. Continued research in data structures should be directed toward determining the most compact and efficient data structures for the storage and representation of information related to problems in two-dimensional signal analysis and image analysis.

Spatial statistics and digital image analysis will play important future roles in science and industry. For example, nondestructive evaluation (NDE) methods will be used more extensively by industry and government to perform quality control and assurance on a spectrum of applications ranging from manufacturing of circuit boards to metal fatigue tests on airplane fuselages. Often such applications of NDE involve ultrasound detection and tomographic reconstruction, and/or two-dimensional signal processing and methods of digital image analysis. Methods of digital image analysis and image reconstruction also will be used to analyze the large-volume spatial data sets necessary to study a variety of issues (e.g., acid rain, ozone depletion, and global warming) related to quantitatively understanding climate and global change processes on planet Earth.
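The locally specified models (Markov random fields) and the Gibbs sampler mentioned earlier can be illustrated with a minimal sketch. The grid size, coupling strength, and sweep count below are invented for the example; the code simulates an Ising-type field by repeatedly resampling each pixel from its full conditional distribution given only its four neighbors.

```python
import numpy as np

rng = np.random.default_rng(2)

# Gibbs sampling for a small Ising-type Markov random field:
#   P(x[i, j] = +1 | neighbors) = 1 / (1 + exp(-2 * beta * s)),
# where s is the sum of the four neighboring values in {-1, +1}.
n, beta, sweeps = 16, 0.6, 50
x = rng.choice([-1, 1], size=(n, n))

def neighbor_sum(x, i, j):
    """Sum of the 4-neighborhood, with free (zero) boundary conditions."""
    s = 0
    if i > 0:
        s += x[i - 1, j]
    if i < x.shape[0] - 1:
        s += x[i + 1, j]
    if j > 0:
        s += x[i, j - 1]
    if j < x.shape[1] - 1:
        s += x[i, j + 1]
    return s

for _ in range(sweeps):          # one sweep = one update of every pixel
    for i in range(n):
        for j in range(n):
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * neighbor_sum(x, i, j)))
            x[i, j] = 1 if rng.random() < p_plus else -1

# After the sweeps, neighboring pixels agree far more often than chance.
agreement = float(np.mean(x[:, :-1] == x[:, 1:]))
```

In the Bayesian image reconstruction setting of chapter 2, a field of this kind serves as the regularizing prior, and the same single-site updates are applied to the posterior that combines the prior with the likelihood of the observed counts.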
Continued advances in data acquisition and digital image analysis will have a significant impact on such diverse fields as medicine and astronomy.

At present, the United States has limited indigenous expertise in spatial statistics and its relation to modern methods of digital image analysis. Few university programs exist that properly accommodate the inherent cross-disciplinary nature of the field. The panel believes that careful consideration should be given to the development of joint curricula in spatial statistics and digital image analysis, which should accurately reflect their diverse applications in the fields of science, engineering, and medicine.
Bibliography

[1] Matérn, B., Spatial Variation, Meddelanden från Statens Skogsforskningsinstitut 49, 1960. Second edition published as Springer Lecture Notes in Statistics, vol. 36, Springer, New York, 1986.

[2] Molina, R., and B. D. Ripley, Using spatial models as priors in astronomical image analysis, J. Appl. Stat. 16 (1989), 193-206.

[3] Neyman, J., and E. L. Scott, Statistical approach to problems of cosmology, J. R. Stat. Soc. B 20 (1958), 1-43.

[4] Ripley, B. D., Spatial Statistics, John Wiley and Sons, New York, 1981.

[5] Sheehan, N. A., Image Processing Procedures Applied to the Estimation of Genotypes on Pedigrees, Technical Report No. 176, Department of Statistics, University of Washington, Seattle, 1989.

[6] Tufte, E., The Visual Display of Quantitative Information, Graphics Press, Cheshire, Conn., 1983.