Probability and Algorithms (1992)

12.
Missing Pieces, Derandomization, and Concluding Remarks

J. Michael Steele¹

University of Pennsylvania

¹ Research supported in part by NSF grant DMS88-12868, AFOSR grant 89-0301, ARO grant DAAL03-89-G-0092, and NSA grant MDA-904-H-2034.

No survey is ever complete, and completeness is especially elusive for a survey of a rapidly evolving subject like the interface of probability and the theory of algorithms. As we go to press, there is late-breaking news of stunning progress on transparent proof techniques, which, in a nutshell, are methods that (in some versions) allow one to test the validity of alleged proofs by applying tests that will fail with probability 1/2 unless the proof is valid. Some of the foundations that underlie this development are touched on in Chapters 4 and 5, but, even so, an honest sketch of the new ideas in transparent proof techniques would require a substantial excursion into the most modern bits of computational complexity theory. Rather than take on that excursion, we have to be content with referring the reader to the journalistic accounts of Kolata (1992) and Cipra (1992) and the more technical discussion of Johnson (1992). The latter contains pointers to the appropriate scientific literature.
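
Although we cannot do justice to the machinery here, the probabilistic core of such proof checking is simple to state: if a single randomized spot check rejects an invalid proof with probability at least 1/2 and never rejects a valid one, then independent repetition drives the error down geometrically. The sketch below is ours, with a hypothetical spot_check predicate standing in for the actual transparent-proof test; it illustrates only the amplification step, not the construction of transparent proofs themselves.

```python
import random

def accept_proof(proof, spot_check, trials=30):
    """Amplification sketch.  `spot_check(proof, r)` is a hypothetical
    randomized test that reads a few locations of `proof` selected by
    the random seed `r`; assume it always passes a valid proof and
    fails an invalid one with probability at least 1/2.  Independent
    repetition then accepts an invalid proof with probability at most
    2 ** -trials."""
    return all(spot_check(proof, random.random()) for _ in range(trials))

# Toy stand-in: a "proof" is a bit string, valid iff every bit is 1,
# and each check reads a single random position.
check = lambda proof, r: proof[int(r * len(proof))] == 1
valid = [1] * 64
bogus = [1] * 32 + [0] * 32          # a single check fails with prob. 1/2
print(accept_proof(valid, check))    # always True
print(accept_proof(bogus, check))    # False except with prob. 2**-30
```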

If one studies the engineering of these recent advances in transparent proof techniques, one finds many structures that were first supported by probabilistic thinking. As it happens, one also finds that considerable effort was subsequently invested to replace most of the original probabilistic scaffolding by constructions that could be called purely deterministic. To probabilists, this passion for excising randomized constructions seems curious, but many computer scientists and combinatorists feel themselves to be on philosophically shaky ground with randomized constructions. Thus, in many contexts, such constructions are accorded only the status of a pure existence proof and, almost always, they are seen as lacking at least some of the pristine virtue of deterministic constructions.

The classical way to atone for the easy virtue of probabilistic constructions has been to supply deterministic replacements, and a famous illustration of this tradition arises in the discussion of expander graphs in Chapter 11. Though it is shockingly easy to show the existence of all sorts of expander graphs via probability, many researchers seemed to breathe a sigh of relief when deterministic constructions for good expanders were developed, even when the new constructions called on methods as sophisticated as those used by Lubotzky et al. (1988).
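
Indeed, the ease of the probabilistic route can be seen empirically. The sketch below is our own illustration (the permutation model, with arbitrary parameters; it is not a construction from the literature cited here): it draws a random 4-regular multigraph and examines its spectral gap, which for typical draws lands near the Ramanujan bound 2√(d−1) that the deterministic graphs of Lubotzky et al. (1988) achieve.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_regular_multigraph(n, half_degree):
    """Permutation model: superpose `half_degree` uniformly random
    permutation matrices with their transposes, giving a random
    2*half_degree-regular multigraph (self-loops are possible but rare)."""
    A = np.zeros((n, n))
    for _ in range(half_degree):
        P = np.eye(n)[rng.permutation(n)]
        A += P + P.T
    return A

n, d = 200, 4
A = random_regular_multigraph(n, d // 2)
eigs = np.linalg.eigvalsh(A)[::-1]   # descending; top eigenvalue is d
print(f"lambda_2 = {eigs[1]:.3f}  vs  Ramanujan bound "
      f"2*sqrt(d-1) = {2 * (d - 1) ** 0.5:.3f}")
```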

Although most of the discoveries provoked by the urge to replace a randomized construction with a deterministic one have turned out to be rather specialized, there are recent developments that change this situation and offer enticing suggestions of an important new theory of derandomization. Instead of pursuing clever ad hoc constructions, the new idea is to look for algorithmic procedures that replace the very steps employed in the randomized construction. This shift in perspective turns out to be very fruitful, and a good sense of its power can be found in Chapter 15 of the important new book The Probabilistic Method, by Alon et al. (1992). Further, because of the remarkable utility of the Lovász Local Lemma in the more subtle probabilistic constructions, the recent algorithmization of the Local Lemma by Beck (1992) and Alon (1992) offers a compelling validation of the derandomization concept.
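
To convey the flavor of this shift in a miniature, textbook instance (our example, far simpler than the Local Lemma applications just cited), consider derandomizing the elementary random-cut argument by the method of conditional expectations: a uniformly random 2-coloring of the vertices cuts each edge with probability 1/2, so some coloring cuts at least half the edges, and replacing the coin flips with greedy choices that never let the conditional expectation drop makes the existence proof algorithmic.

```python
from collections import defaultdict

def derandomized_max_cut(n, edges):
    """Method of conditional expectations for the random-cut argument.

    A uniformly random 2-coloring cuts each edge with probability 1/2,
    so E[cut] = len(edges) / 2.  Fixing vertices one at a time, and
    choosing each side so the conditional expectation never decreases,
    converts the existence proof into a deterministic algorithm."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    side = {}
    for v in range(n):
        # Edges to unplaced neighbors are cut with probability 1/2
        # whichever side v takes, so they cancel: the optimal move is
        # to join the minority side among already-placed neighbors.
        ones = sum(side[u] for u in adj[v] if u in side)
        placed = sum(1 for u in adj[v] if u in side)
        side[v] = 1 if ones < placed - ones else 0
    return side, sum(1 for u, v in edges if side[u] != side[v])

edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
side, cut = derandomized_max_cut(4, edges)
print(side, cut)          # guaranteed: cut >= len(edges) / 2
```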

In addition to its useful discussion of derandomization, the volume of Alon et al. (1992) also provides charming introductory treatments of at least two other topics that may seem underrepresented in this survey, graph algorithms and random graphs. The latter topic is also well covered in the treatise Random Graphs by Bollobás (1985), which is a bible for any serious student of random graphs. From the probabilists' point of view, two of the most important recent links to these areas are to percolation theory and to the Chen-Stein method of Poisson approximation. Much of the recent progress in percolation theory is beautifully introduced by Grimmett (1989), and the recent treatise on Poisson approximation of Barbour et al. (1992) surely belongs on the bookshelf of anyone interested in probability, graphs, or discrete algorithms.
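
A small illustration of the kind of statement the Chen-Stein method makes quantitative (our example; problems of this type are treated at length by Barbour et al., 1992): the number of isolated vertices in the random graph G(n, p) is approximately Poisson with mean λ = n(1 − p)^(n−1) when λ remains bounded, and even a quick simulation exhibits the closeness.

```python
import math, random
from collections import Counter

def isolated_count(n, p, rng):
    """One draw of G(n, p): return the number of degree-0 vertices."""
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                deg[i] += 1
                deg[j] += 1
    return sum(d == 0 for d in deg)

n, trials = 100, 1000
p = math.log(n) / n                  # regime where the mean stays O(1)
lam = n * (1.0 - p) ** (n - 1)       # E[number of isolated vertices]
rng = random.Random(12345)
freq = Counter(isolated_count(n, p, rng) for _ in range(trials))
for k in range(5):
    poisson_k = math.exp(-lam) * lam ** k / math.factorial(k)
    print(f"k={k}: empirical {freq[k] / trials:.3f}  Poisson {poisson_k:.3f}")
```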

Another important sphere of probability in the service of algorithms that some may see as underrepresented here is the analytical probabilistic analysis of algorithms. This area can be characterized by the use of explicit combinatorial calculations and generating functions to calculate the means and variances of algorithm running times and other features of interest. Historically, the critical parts of such calculations tend to be more closely connected with real and complex analysis than with probability theory, but the language of probability always drives the problem formulation and increasingly contributes to the analysis. Much of this tradition springs from the seminal work of Donald Knuth, with many illustrations of the central themes found in his now classical books The Art of Computer Programming, vols. 1-3 (Knuth, 1973). A more recent and introductory treatment that is sympathetic in approach is the text of Bender and Williamson (1991), which also can be commended for the insights it offers into asymptotic analyses assisted by generating functions. Another recent volume that anyone involved with the analytical tradition should read is Wilf (1990), which christens "generatingfunctionology" as a field in itself and also offers up many of the field's secrets in a way that allows them to be enjoyably mastered.
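
As a miniature worked example in this tradition (ours, though the result itself is classical and treated by Knuth): the expected number of comparisons C_n used by quicksort on a random permutation satisfies the recurrence C_n = n − 1 + (2/n) Σ_{k<n} C_k, and a few lines of exact rational arithmetic confirm the closed form C_n = 2(n + 1)H_n − 4n.

```python
from fractions import Fraction

def quicksort_mean_comparisons(N):
    """Exact solution of C_n = n - 1 + (2/n) * sum_{k<n} C_k, the
    recurrence for the expected comparison count of quicksort on a
    uniformly random permutation of n distinct keys."""
    C = [Fraction(0)]
    for n in range(1, N + 1):
        C.append(n - 1 + Fraction(2, n) * sum(C))
    return C

N = 10
C = quicksort_mean_comparisons(N)
H = [Fraction(0)]                     # harmonic numbers H_0 .. H_N
for n in range(1, N + 1):
    H.append(H[-1] + Fraction(1, n))
for n in range(1, N + 1):             # classical closed form
    assert C[n] == 2 * (n + 1) * H[n] - 4 * n
print(C[N], float(C[N]))              # about 24.44 comparisons for n = 10
```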

A final volume that deserves mention here is the recent collection Probabilistic Combinatorics and Its Applications, edited by Bollobás (1991). The seven essays in this collection are all of great interest to the field, and each points toward many lively research topics. In particular, the essay by F.R.K. Chung (1991) provides quite another perspective on derandomization theory and illustrates many of the subtleties that perplex investigators who examine randomness in a quest to find acceptable surrogates.

References

Alon, N. (1992), A parallel algorithmic version of the Local Lemma, Rand. Struct. Algorithms 2, 367-377.

Alon, N., J. Spencer, and P. Erdős (1992), The Probabilistic Method, Wiley-Interscience Series in Discrete Mathematics and Optimization, John Wiley & Sons, New York.

Barbour, A.D., L. Holst, and S. Janson (1992), Poisson Approximation, Oxford Studies in Probability no. 2, Clarendon Press, Oxford.

Beck, J. (1992), An algorithmic approach to the Lovász Local Lemma I, Rand. Struct. Algorithms 2, 343-366.

Bender, E.A., and S.G. Williamson (1991), Foundations of Applied Combinatorics, Addison-Wesley, Reading, Mass.

Bollobás, B. (1985), Random Graphs, Academic Press, New York.

Bollobás, B., ed. (1991), Probabilistic Combinatorics and Its Applications, Proceedings of Symposia in Applied Mathematics no. 44, American Mathematical Society, Providence, R.I.

Chung, F.R.K. (1991), Constructing random-like graphs, in Probabilistic Combinatorics and Its Applications, B. Bollobás, ed., American Mathematical Society, Providence, R.I., 21-56.

Cipra, B.A. (1992), Theoretical computer scientists develop transparent proof technique, SIAM News 25 (May), 1.

Graham, R.L., D.E. Knuth, and O. Patashnik (1989), Concrete Mathematics: A Foundation for Computer Science, Addison-Wesley, Reading, Mass.

Grimmett, G. (1989), Percolation, Springer-Verlag, New York.

Johnson, D.S. (1992), The NP-completeness column: An ongoing guide (23rd edition): The tale of the second prover, J. Algorithms 13 (to appear in September issue).

Knuth, D.E. (1973), The Art of Computer Programming, vols. 1-3, Addison-Wesley, Reading, Mass.

Kolata, G. (1992), New short cut found for long proofs, New York Times, April 6, section C.

Lubotzky, A., R. Phillips, and P. Sarnak (1988), Ramanujan graphs, Combinatorica 8, 261-277.

Wilf, H.S. (1990), Generatingfunctionology, Academic Press, Boston.
