Suggested Citation:"TRANSCRIPT OF PRESENTATION." National Research Council. 2004. Statistical Analysis of Massive Data Streams: Proceedings of a Workshop. Washington, DC: The National Academies Press. doi: 10.17226/11098.


SOME CHALLENGES IN EXPERIMENTAL PARTICLE PHYSICS DATA STREAMS

TRANSCRIPT OF PRESENTATION

MR. SCOTT: A decade ago, Steve Sain, who is sitting in the back corner here, was trying to write his thesis. These physicists kept bugging him to do these nonparametric methods, and finally found the top quark. Paul Padley has very graciously agreed to give us a talk that explicitly illustrates the statistical challenges that are out there and, without any further ado, I will turn it over to Paul.

MR. PADLEY: I am going to do my own introduction, too. The most important thing I want you to leave with today is that I come from Rice, and it was ranked the coolest university in the United States by Seventeen magazine. While my daughters will challenge me on that issue, who am I to argue with Seventeen magazine? So, I do particle physics, which you just heard about, and my brief summary of what we try to do is answer the following questions: What are the elementary constituents of matter, and what are the basic forces that control their behavior at the most basic level?

In the process of doing that, we ask some very weighty questions. I mean that literally. So, one of the outstanding problems—and Bob may reference this—is the bit of the theory that is marked as not having been checked, or not confirmed. The bit of the theory that has not been confirmed is the thing that tells us why we have mass. I assume we have mass; you have mass. We don't know where that comes from. A point particle, the top quark, has the mass of 175 protons. A proton is a big, extended object. It has quarks and things in it. So, why does this point particle, which is infinitely small, have mass? So, there are some pretty big, weighty questions that haven't been answered by the standard model. We don't know why things have mass. We don't have a quantum theory of gravity, which is some sort of clue, since gravity is the thing that interacts with mass. So, to answer those questions, we do weighty experiments. Here is the next-generation experiment that we are building at CERN. It will weigh 12,500 tons. Notice the person, for scale. You can think of this as a big instrumented device. Those 12,500 tons of detector are instrumented.

To do this, we need big international collaborations. Here is the experiment I currently work on, DØ. There is a small subset of the people who worked on building that experiment, which is running and taking data now. It is one of the two experiments that discovered the top quark. A free beer if you can find me in there. I am there. Another free beer if you can figure out which client was upside down. You can think of a big particle physics detector as a smart device, doing real-time analysis. The next-generation experiment will, in principle, produce about 40 terabytes of data a second. What we do is use this big smart device to select which data are interesting. We will knock it down to something like 800 gigabits per second in real time, on the detector itself; we never try to get the full 40 terabytes per second off. I, for example, work on the bit out on the edge—a bit of the detector with artificial intelligence on it—that tries to identify whether what happened in there was interesting or not, to say, "Hey, wow, something interesting happened here; let's think about it some more." We then ship the data out to real-time computing, and we reduce it to something like 10 terabytes of data a day. Each step in that process is a statistical process.
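The data-rate reduction described here can be put into rough numbers. This is a sketch using only the figures quoted in the talk; the stage names in the comments are illustrative, not the experiment's own terminology:

```python
# Rough arithmetic for the trigger cascade described above,
# using the rates quoted in the talk.
raw_rate = 40e12              # ~40 terabytes/second produced on the detector
after_trigger = 800e9 / 8     # ~800 gigabits/second after on-detector selection, in bytes/s
written_per_day = 10e12       # ~10 terabytes/day finally written out

trigger_reduction = raw_rate / after_trigger    # on-detector reduction factor
written_rate = written_per_day / 86400          # bytes/second reaching storage
overall_reduction = raw_rate / written_rate     # end-to-end reduction factor

print(f"on-detector reduction: ~{trigger_reduction:.0f}x")   # ~400x
print(f"end-to-end reduction:  ~{overall_reduction:.2e}x")   # ~3.5e5x
```

The point of the arithmetic is that more than 99.999 percent of the raw data is discarded by statistical decisions made in real time, before any offline analysis begins.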

Given this heavy need for statistics, one can ask the question, how do particle physicists use statistics? Bob made reference to the fact that we all use Gaussian statistics. In fact, if you polled a group of particle physicists and asked them, if two hypotheses both have a good chi-squared, which hypothesis do you take, the vast majority would say the one with the lower chi-squared, and not answer that both are valid. Your typical particle physicist, when you say, "Well, actually, both hypotheses are valid; what are you going to do with that?," will look at you dumbfounded. Last year, the observation of neutrinoless double beta decay was announced, and they published it. They got it published. Okay, there is an effort to rectify this. Here is a headline from what I will call a trade magazine, the CERN Courier, the standard popular magazine about our business: "Physicists and Statisticians Get Technical in Durham." There was a big conference in Durham last March on applying statistical techniques to particle and astrophysics data. How many statisticians even attended this conference? None. They then go on proudly to say, "this teaming of physicists and statisticians." There they all are. Look for your colleagues in there. "Almost 100 physicists and 2 professional statisticians gathered." That is a direct quote.

Okay, we have a problem. Before I go on—some of us have actually tried to rectify this situation. I am going to talk about some work that has gone on, and I am representing other people who have done the real work. The results I will show come from my graduate student, Andrew Askew; a former student at Rice, Bruce Knuteson; a colleague, Hannu Miettinen; and another colleague, Sherry Towers, at Fermilab. Then, I owe a lot of thanks to David Scott, because he is always there and I can come and ask him my naive questions like, "You mean you don't take the lowest chi-squared?" He will patiently answer them for me. First, a little bit of what we do. We talked about how we collide protons, for example, and look at what comes pouring out into our detector, event by event, and then we statistically analyze that, looking for new physics or known physics. It is a little bit like taking two strawberries and smashing them together. You get this ball of energy that then turns into a bowl of fruit, and we look at the fruit coming out.

Here is a specific example. This is something where you get a trip to Sweden. This would be a proton-antiproton—or proton-proton—collision producing, say, two photons. This is what we see in the detector, and we have to statistically analyze that to figure out what went on. Then we make an invariant mass distribution of these two-photon signals over a large sample of events and look for a bump. That would be the discovery of the Higgs signal. Of course, this phrase, "select all the events"—with the correct apologies, that is a very complex step. So, here is an event. What I have done is plotted the detector layout and the energy deposited in the detector.
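The bump hunt sketched here rests on the invariant mass of the two-photon system. Below is a minimal sketch with made-up four-vectors; the 125 GeV value and the variable names are purely illustrative:

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass from two (E, px, py, pz) four-vectors, all in the same units."""
    E  = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    m2 = E * E - (px * px + py * py + pz * pz)
    return math.sqrt(max(m2, 0.0))  # guard against tiny negative rounding error

# Two back-to-back 62.5 GeV photons reconstruct to a 125 GeV mass.
gamma1 = (62.5,  62.5, 0.0, 0.0)
gamma2 = (62.5, -62.5, 0.0, 0.0)
print(invariant_mass(gamma1, gamma2))  # 125.0
```

Filling a histogram of this quantity over all selected events and looking for an excess above the smooth background is the "bump" search described above.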

There are still obvious big things there, with missing energy and an electron. Then, as I look at the event, I find, well, there is a jet from a fundamental quark, and another one and another one and another one. Those things get pretty hard to fish out in the data. So, we need pretty good tools for doing that. So, we have a two-step problem. We look at each event and try to figure out, event by event, what has happened, and that is a complex pattern recognition problem with lots of cluster finding and other things that we need to do, plus track fitting and identifying things. Then we have to take a cohort of events, statistically analyze them, and look for physics. It gets pretty complex. In the next-generation experiment, here is what you will actually see: an event with four muons coming out. If you were looking at it, all you would see is that, and you have to find the tracks in that mess. I have made it look worse than it really is, because I have compressed a three-dimensional thing into two dimensions.

It is not just those bumps we look for. Sometimes, what we are looking for are differences from the standard model that are a little bit more subtle. For example, this is a simulated distribution of two parameters—it doesn't matter what—for standard model particle physics. Here is a simulated distribution assuming there were extra dimensions to the universe. The fact that we could use particle physics experiments to look for extra dimensions was mind-boggling to me, not something I had expected. Here is what we saw in the data. Then we have to ask the question, well, are we seeing extra dimensions in that data or not? Clearly, we are not in this case, but sometimes it is a little bit more subtle. So, I want to talk about two particular attempts to go a little bit beyond the sort of analysis that Bob was talking about. He made reference there to using neural networks, and we have been trying to use kernel density estimation. Then, a method for looking for new, unexpected things.

Neural networks have become commonly used as a pattern recognition tool, but their black-box nature worries many in the field. I mean, physicists don't really like not being able to visualize what is going on inside. So, a group at Rice, named here—is Sain here? There he is. I have never met him; this happened before I got to Rice. They developed a method called PDE that was formulated to look for top quarks. It is a multivariate approach where you can plot things to understand what is going on. You want to form a discriminant function that can separate signal from background. So, you can have a general form like this, where x is some vector of parameters that you have measured. So, you have a function describing the signal and

then the background, and you try to make a discriminant from them. So, we need to formulate these feature functions. What they did was use kernel estimation. So, for each data point in the signal or background, they put in a function, typically a Gaussian function—actually, we have only ever used Gaussian functions. This function describes a surface in n dimensions, and you form the signal and background surfaces using Monte Carlo simulated data of the signal you are looking for and of the background. So, you can just think of these as smooth surfaces representing the data. That is much more straightforward than thinking about what a neural network is doing.
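A toy version of the construction just described: one Gaussian kernel per Monte Carlo point, with the two resulting densities combined into a discriminant D(x) = f_s/(f_s + f_b). The samples and bandwidth are made up for illustration, and the actual PDE method differs in detail (in particular, its covariance-adapted kernels, discussed next):

```python
import math

def gaussian_kde(points, h):
    """Density estimate: one spherical Gaussian kernel of width h per data point."""
    dim = len(points[0])
    norm = len(points) * (h * math.sqrt(2 * math.pi)) ** dim
    def f(x):
        total = 0.0
        for p in points:
            d2 = sum((xi - pi) ** 2 for xi, pi in zip(x, p))
            total += math.exp(-d2 / (2 * h * h))
        return total / norm
    return f

# Toy 1-D "signal" and "background" Monte Carlo samples (made-up numbers).
signal     = [(2.0,), (2.2,), (1.8,), (2.1,)]
background = [(0.0,), (0.3,), (-0.2,), (0.1,)]
f_s = gaussian_kde(signal, h=0.5)
f_b = gaussian_kde(background, h=0.5)

def discriminant(x):
    """D(x) = f_s/(f_s + f_b): near 1 where x is signal-like, near 0 otherwise."""
    s, b = f_s(x), f_b(x)
    return s / (s + b)

print(discriminant((2.0,)))  # close to 1: signal-like
print(discriminant((0.0,)))  # close to 0: background-like
```

Cutting on D at some threshold is the "graphical cut" mentioned below: the surfaces f_s and f_b can be plotted and inspected, unlike a neural network's weights.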

It reflects the covariance of the data. You construct a transformation so that one class of the data has a covariance matrix that is the unit matrix, and the other is diagonal. That is something you can do, and then you write it in mathematical form. There is a free parameter that enters. So, by following a recipe where we make these kernel functions, we can make a discriminant function and make a graphical cut, something where you can visualize what you are doing. Here is just a Monte Carlo, arbitrary-parameter signal that we wanted to look for
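The covariance-driven transformation can be sketched for the two-dimensional case. This is a minimal Cholesky-based whitening step that gives one class the unit covariance matrix; the additional rotation that simultaneously diagonalizes the second class's covariance is omitted here, so treat it as a partial sketch of the recipe:

```python
import math

def covariance(points):
    """Population covariance (sxx, sxy, syy) of a list of 2-D points."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    return sxx, sxy, syy

def whitening_transform(points):
    """Return T such that the T-transformed points have unit covariance."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx, sxy, syy = covariance(points)
    # Cholesky factor L of the covariance matrix; T applies L^-1 to centered points.
    l11 = math.sqrt(sxx)
    l21 = sxy / l11
    l22 = math.sqrt(syy - l21 * l21)
    def T(p):
        u = (p[0] - mx) / l11
        v = ((p[1] - my) - l21 * u) / l22
        return (u, v)
    return T

pts = [(1.0, 2.0), (2.0, 3.0), (3.0, 5.0), (4.0, 4.0), (5.0, 7.0), (0.0, 1.0)]
T = whitening_transform(pts)
sxx, sxy, syy = covariance([T(p) for p in pts])
print(sxx, sxy, syy)  # ~1, ~0, ~1: unit covariance after the transform
```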

distribution, and here is a background. Then, you apply this technique and you get a model of the signal and a model of the background. Here is the discriminant function; you would make a cut somewhere here and then be able to pick the signal out.

We have modified that original method a little bit. One of the problems with that method arises if you have outliers. The outliers each get a little Gaussian put around them. So, we have made a way to adapt that and smooth it out. So, we have introduced another parameter. Our colleague at Fermilab, Sherry Towers, independently came up with a similar method that uses local covariance. Here is a comparison of these methods for particularly hard signal and background samples, where a neural network just does a terrible job and our methods do a good job. So, that is one thing. So, one thing that we have tried to do is, in applying these

cuts—sort of making cuts and boxes, as was described before—try to use other techniques. A lot of people are using neural networks, but there is a group of people trying things like support vector machines and kernel density estimation. Another problem we come up with continuously is, you have background, and here, background means the standard model. What you see in the data is this. So, have we found new physics in these points out here or have we not? That is a common problem. In fact, if you were to take a poll of the particle physics community as to what the new physics would be, you would get a whole pile of answers. So, we have this standard model that works beautifully and explains all the known data. Yet I think if you polled the particle physics community, nobody would believe that the standard model is right. There are all these free parameters, you don't really understand mass in this context, and there are a lot of ideas out there for what the new physics would be—new bosons, all these things that are sort of meaningless. A big contingent would say something else. So, this was just a straw poll then.

Even if you pick the most popular model of what you are looking for, supersymmetry, there are 105 free parameters in the theory. So, how do you possibly know—and changing those parameters changes what you will see in the detector. So, the problem, in a nutshell, is that we have a well-defined standard model. We need to look for interesting physics in the data—there is lots of data. We need to statistically analyze these events to determine if we have found some physics of interest, but we probably don't know what the physics of interest is that we are looking for. So, we are looking for something unknown in this vast mass of data. Now, the method that was described to you before is, typically, you select a model to be tested, you find a measurable prediction and a couple of

parameters of that model, and you go check those predictions against the data. That is fine when you have a finite set of models to test. With this huge plethora of possible models—and every variation of the parameters of, say, supersymmetry is a different model with different consequences for the experiment—this becomes a real problem. So, at the DØ experiment, a generic search was tried, something called Sleuth. Typically, what is done in a physics analysis is that you have some class of model—minimal supergravity, say—with some particular set of parameters, and you can go and try to look for that case. You can consider looking for some larger set of the parameters. What we really want to do is make our searches so that we are looking for something new in general. So, the typical physics analysis in a particle physics experiment is done at 1.5 on this scale, and we would like to be up here at 6.0, searching through the data, looking for new things. The other problem that comes up all the time is that you get one unique event. Bob showed you a unique event before. Well, how do you find those unique events and decide that they are unique? We would like our method to be able to do that in an unbiased way.

So, what we did is basically try to boil the data down into a finite set of fundamental things that you can measure. We looked at different combinations of the particles—what they are—and then we tried to see what it is about those particles we are measuring that is interesting. In our case, we have the advantage that the known physics happens at low energy—at low transverse energy in the experiments. So, if we look for things that happen at high energy, they are likely to be interesting. So, we picked a set of parameters which is basically the momentum in the perpendicular direction of the things that we were looking for.

Then we go one more step. We do a variable transform. Say we have our data gathered on a space like this. We basically push that data out into a unit box and map the data onto a uniform grid. We take the simulated background data and map it out onto a uniform grid in a unit box in n dimensions. We can go back to that example that I gave before and look at what happens to the signal.
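One simple way to push a sample into the unit box is to run each coordinate through its empirical distribution function—a rank transform. This is a plausible sketch of the kind of variable change described, not the exact Sleuth prescription (which maps via the background model):

```python
def to_unit_box(data):
    """Map each coordinate through its empirical CDF (a rank transform),
    pushing the sample toward a uniform distribution on the unit box."""
    n = len(data)
    dims = len(data[0])
    out = [[0.0] * dims for _ in range(n)]
    for d in range(dims):
        order = sorted(range(n), key=lambda i: data[i][d])
        for rank, i in enumerate(order):
            out[i][d] = (rank + 0.5) / n   # ranks mapped into (0, 1)
    return [tuple(row) for row in out]

sample = [(5.0, 1.0), (2.0, 9.0), (8.0, 4.0), (1.0, 7.0)]
print(to_unit_box(sample))
```

After such a transform, the background fills the box roughly uniformly, so any clustering of data points in a corner stands out as potentially interesting.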

The way we set up our variables, it would tend to push those events up into a cluster in the corner. Then we have to ask the question, is that an interesting cluster that we are looking at? So, what we do is create what are called Voronoi regions around a cluster of data points: the region is the set of all values of x closer to a data point in that cluster than to any other data point in the sample. So, you break the data up into regions, and then you look through those regions for an excess in that space.
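Voronoi membership reduces to a nearest-neighbor test: a point belongs to the region of whichever data point it is closest to. A minimal two-dimensional sketch with made-up sites in the unit box:

```python
import math

def nearest(point, sites):
    """Index of the site whose Voronoi region contains `point` in 2-D,
    i.e., the closest site (ties broken by lowest index)."""
    best, best_d = 0, float("inf")
    for i, s in enumerate(sites):
        d = math.hypot(point[0] - s[0], point[1] - s[1])
        if d < best_d:
            best, best_d = i, d
    return best

# Data points acting as Voronoi sites in the unit box (toy values).
sites = [(0.2, 0.2), (0.8, 0.8), (0.9, 0.1)]
print(nearest((0.25, 0.3), sites))   # 0: inside the first site's region
print(nearest((0.95, 0.95), sites))  # 1: inside the second site's region
```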

Basically, you can assign a probability: what is the probability that you would see something more interesting than what you saw? If the data contain no new physics, these P's are just random, uniform between zero and one. If the data do contain new physics, then you hopefully find P to be small. This method was tested on known physics. For example, the search for the top quark, which was a big discovery, was reproduced using this. This method did find the top quark, but the price you pay for such a general search is that you have less sensitivity to the new thing you are looking for. What was amazing with this is, here is a list of all of the channels and the limits we could set looking for new physics in all those channels. In the traditional sort of
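The "probability of seeing something more interesting" can be sketched with a Poisson tail probability—how often background alone would give at least as many events in a region as were observed. This is one plausible reading of the comparison being made, not the exact Sleuth statistic:

```python
import math

def poisson_tail(n_obs, mu):
    """P(N >= n_obs) when the background alone predicts a Poisson mean mu."""
    return 1.0 - sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(n_obs))

# With 2.0 background events expected in a region (made-up numbers):
print(poisson_tail(2, 2.0))  # seeing >= 2 is unremarkable (~0.59)
print(poisson_tail(9, 2.0))  # seeing >= 9 is rare (~2e-4)
```

Under the background-only hypothesis, such tail probabilities are approximately uniform on [0, 1], which is exactly the behavior of the P's described above.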

analysis that Bob was describing, as is normally done, a graduate student would come and pick a channel and take a model, and test that channel against that model. With this method, we were able to search down through a whole list of channels in one fell swoop, going through our masses of data. Now, one of the things that made this possible is that, at the time of this analysis, it was mature data. We had very good simulations of the detector, so we really could model the standard model in it very well. It was very mature, well-understood data. That made it easier to go searching through. We think there is a hint in here for this general problem of looking for unknown things. So, just to conclude, I think particle physics presents a number of challenges of interest. You have just seen a little taste, a little scratch, of the many places—every step along the way—where we face statistical challenges. We certainly have large streams of data that we must search, in real time and offline, for signals of interest that are possibly—in fact, the most interesting ones would be—unanticipated. We have the advantage of a well-defined standard model to test against and, actually, techniques to generate the data that we use for that, which will be talked about in the next talk. There are people in the community who have actually talked to a statistician now. That is a step in the right direction. Of course, we always have this hurdle: we know we are smarter than everybody else, so we try to reinvent things for ourselves from scratch. There is a small group of people who have actually spoken to at least two statisticians. So, we know it happens. So, we are in the very early, infant stages. I have shown some of what I call baby steps that have gone on at our experiment at DØ, which are unique.
I mean, the number of people even within the experiment who would understand the phrase kernel density estimation would be very small, or who would even know what a kernel is. So, there is a lot of progress that needs to be made. The statistics conference that occurred is going to be a continuing series, it looks like. There is another one scheduled for next year. So, there is at least a recognition in the community that we should be talking to the statistics community, and that dialogue should start. Hopefully, we will get smarter about how we do analysis. I will finish there. AUDIENCE: [Question off microphone.] MR. PADLEY: Yes, I think that has to be done individually for each combination of particles. In fact, really, you don't need to do it. I mean, if you are smart, you could bypass that whole step. I think next time around, methods will be used to bypass that. It is the idea of trying to find—the problem you always get is you have

some distribution that falls off with an exponential tail. You have two events way out here, far away from the standard model. So, is that interesting? I mean, Carlo Rubbia made a career out of discovering things out there that, as they took more data, the distribution— [off microphone.] MR. SCOTT: I have a question, Paul. You have 6, 7, 10 dimensions of data. Are these real measurements or derived measurements, typically? MR. PADLEY: That is almost level three, in this case, which is one of the problems. Part of what made that analysis possible at the end is, what we started off with is about a million measurements per event. You then have to distill that down into, say, vectors, and there will be hundreds of those in an event. We have tried to knock it down to five or six or ten parameters that characterize the events. That is like level three. That was really only possible because of the maturity of the data and, by the time that analysis was done, there was a lot of confidence in the other steps that went into forming that data set.
