12. Missing Pieces, Derandomization, and Concluding Remarks

J. Michael Steele,1 University of Pennsylvania

No survey is ever complete, and completeness is especially elusive for a survey of a rapidly evolving subject like the interface of probability and the theory of algorithms. As we go to press, there is late-breaking news of stunning progress on transparent proof techniques, which, in a nutshell, are methods that (in some versions) allow one to test the validity of alleged proofs by applying tests that will fail with probability 1/2 unless the proof is valid. Some of the foundations that underlie this development are touched on in Chapters 4 and 5, but, even so, an honest sketch of the new ideas in transparent proof techniques would require a substantial excursion into the most modern bits of computational complexity theory. Rather than take on that excursion, we have to be content with referring the reader to the journalistic accounts of Kolata (1992) and Cipra (1992) and the more technical discussion of Johnson (1992), which contains pointers to the appropriate scientific literature.
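The power of such a test rests on simple amplification: if one spot-check catches an invalid proof with probability 1/2, then k independent checks all fail to catch it with probability only 2^(-k). The Python sketch below illustrates only this amplification arithmetic; the spot_check routine is a hypothetical stand-in, not an actual proof checker.

```python
import random

def spot_check(proof_is_valid):
    """Hypothetical spot-check: always accepts a valid proof,
    and rejects an invalid one with probability 1/2."""
    if proof_is_valid:
        return True
    return random.random() < 0.5  # an invalid proof still slips through half the time

def amplified_check(proof_is_valid, k):
    """Run k independent spot-checks and accept only if all pass.
    An invalid proof survives with probability 2**-k."""
    return all(spot_check(proof_is_valid) for _ in range(k))

# A valid proof is always accepted; an invalid one essentially never is.
random.seed(0)
trials = 10_000
false_accepts = sum(amplified_check(False, 20) for _ in range(trials))
print(false_accepts)  # expected count is trials * 2**-20, i.e., about 0.01
```

With k = 20, even ten thousand attempts to pass off an invalid proof are overwhelmingly likely to produce no false acceptance at all.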

If one studies the engineering of these recent advances in transparent proof techniques, one finds many structures that were first supported by probabilistic thinking. As it happens, one also finds that considerable effort was subsequently invested to replace most of the original probabilistic scaffolding by constructions that could be called purely deterministic. To probabilists, this passion for excising randomized constructions seems curious, but many computer scientists and combinatorists feel themselves to be

1 Research supported in part by NSF grant DMS-8812868, AFOSR grant 89-0301, ARO grant DAAL03-89-G-0092, and NSA grant MDA904-H-2034.


on philosophically shaky ground with randomized constructions. Thus, in many contexts, such constructions are accorded only the status of a pure existence proof and, almost always, they are seen as lacking at least some of the pristine virtue of deterministic constructions.

The classical way to atone for the easy virtue of probabilistic constructions has been to supply deterministic replacements, and a famous illustration of this tradition arises in the discussion of expander graphs in Chapter 11. Though it is shockingly easy to show the existence of all sorts of expander graphs via probability, many researchers seemed to breathe a sigh of relief when deterministic constructions for good expanders were developed, even when the new constructions called on methods as sophisticated as those used by Lubotzky et al. (1988).
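The ease of the probabilistic argument can even be seen empirically: almost every random bipartite graph in which each left vertex picks a few random right neighbours turns out to expand. The brute-force check below is only a toy illustration; the parameters (n = 12 vertices, degree d = 3, expansion factor 1.2 on sets of up to a quarter of the vertices) are chosen for speed, not sharpness.

```python
import random
from itertools import combinations

def random_bipartite(n, d, rng):
    """Each of n left vertices picks d distinct random right neighbours."""
    return {v: set(rng.sample(range(n), d)) for v in range(n)}

def expands(graph, n, alpha, c):
    """Check by brute force that every left set S with |S| <= alpha*n
    has at least c*|S| right neighbours."""
    verts = list(graph)
    for size in range(1, int(alpha * n) + 1):
        for S in combinations(verts, size):
            nbrs = set().union(*(graph[v] for v in S))
            if len(nbrs) < c * size:
                return False
    return True

rng = random.Random(0)
hits = sum(expands(random_bipartite(12, 3, rng), 12, 0.25, 1.2)
           for _ in range(20))
print(hits)  # nearly all (typically all) of the 20 random graphs expand
```

The point, of course, is that no cleverness was needed: randomness alone produces expansion with overwhelming probability, which is precisely what makes the deterministic constructions so much harder to come by.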

Although most of the discoveries provoked by the urge to replace a randomized construction with a deterministic one have turned out to be rather specialized, there are recent developments that change this situation and offer enticing suggestions of an important new theory of derandomization. Instead of pursuing clever ad hoc constructions, the new idea is to look for algorithmic procedures that replace the very steps employed in the randomized construction. This shift in perspective turns out to be very fruitful, and a good sense of its power can be found in Chapter 15 of the important new book The Probabilistic Method, by Alon et al. (1992). Further, because of the remarkable utility of the Lovász Local Lemma in the more subtle probabilistic constructions, the recent algorithmization of the Local Lemma by Beck (1992) and Alon (1992) offers a compelling validation of the derandomization concept.
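A standard first illustration of this algorithmic point of view is the derandomization of the random cut by the method of conditional expectations: a uniformly random two-coloring of the vertices cuts each edge with probability 1/2, hence half the edges in expectation, and placing the vertices one at a time so that the conditional expectation never decreases yields a deterministic cut of at least half the edges. A minimal sketch, on a made-up example graph:

```python
def greedy_maxcut(n, edges):
    """Derandomize the uniformly random cut by conditional expectations:
    place each vertex on the side that (weakly) maximizes the conditional
    expected cut size.  Edges to still-unplaced neighbours contribute 1/2
    either way, so only edges to already-placed neighbours matter."""
    side = {}
    for v in range(n):
        cut_if_0 = sum(1 for (a, b) in edges
                       if (a == v and side.get(b) == 1)
                       or (b == v and side.get(a) == 1))
        cut_if_1 = sum(1 for (a, b) in edges
                       if (a == v and side.get(b) == 0)
                       or (b == v and side.get(a) == 0))
        side[v] = 0 if cut_if_0 >= cut_if_1 else 1
    cut = sum(1 for (a, b) in edges if side[a] != side[b])
    return side, cut

# A 4-cycle with one chord: 5 edges, so the guarantee is a cut of size >= 3.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
side, cut = greedy_maxcut(4, edges)
print(cut)  # prints 4
```

The guarantee is exactly the one the probabilistic argument promised, but the randomness has been traded away for a greedy scan over the vertices.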

In addition to its useful discussion of derandomization, the volume of Alon et al. (1992) also provides charming introductory treatments of at least two other topics that may seem underrepresented in this survey, graph algorithms and random graphs. The latter topic is also well covered in the treatise Random Graphs by Bollobás (1985), which is a bible for any serious student of random graphs. From the probabilists' point of view, two of the most important recent links to these areas are to percolation theory and to the Chen-Stein method of Poisson approximation. Much of the recent progress in percolation theory is beautifully introduced by Grimmett (1989), and the recent treatise on Poisson approximation of Barbour et al. (1992) surely belongs on the bookshelf of anyone interested in probability, graphs, or discrete algorithms.
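The flavor of Poisson approximation is easy to convey numerically: the number of successes among many rare, nearly independent events is close to Poisson, and bounds in the Chen-Stein spirit control the total variation distance. The sketch below compares a Binomial(n, λ/n) law with Poisson(λ) and checks the result against Le Cam's elementary bound λ²/n; the truncation of the sum at k = 40 discards only a negligible tail.

```python
from math import comb, exp, factorial

def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    return exp(-lam) * lam**k / factorial(k)

n, lam = 1000, 2.0
p = lam / n

# Total variation distance between Binomial(1000, 0.002) and Poisson(2),
# summed over k = 0..40 (the remaining tail mass is astronomically small).
tv = 0.5 * sum(abs(binom_pmf(n, p, k) - poisson_pmf(lam, k))
               for k in range(41))
print(tv)  # positive, and well below Le Cam's bound lam**2 / n = 0.004
```

The computed distance comfortably beats the elementary bound; the sharper constants are what the Chen-Stein machinery of Barbour et al. (1992) delivers, together with bounds that survive substantial dependence among the events.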

Another important sphere of probability in the service of algorithms that some may see as underrepresented here is the analytical probabilistic


analysis of algorithms. This area can be characterized by the use of explicit combinatorial calculations and generating functions to calculate the means and variances of algorithm running times and other features of interest. Historically, the critical parts of such calculations tend to be more closely connected with real and complex analysis than with probability theory, but the language of probability always drives the problem formulation and increasingly contributes to the analysis. Much of this tradition springs from the seminal work of Donald Knuth, with many illustrations of the central themes found in his now classical books The Art of Computer Programming, vols. 1-3 (Knuth, 1973). A more recent and introductory treatment that is sympathetic in approach is the text of Bender and Williamson (1991), which also can be commended for the insights it offers into asymptotic analyses assisted by generating functions. Another recent volume that anyone involved with the analytical tradition should read is Wilf (1990), which christens "generatingfunctionology" as a field in itself and also offers up many of the field's secrets in a way that they can be enjoyably mastered.
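As a small taste of the generating-function style, recall the classical fact that the polynomial x(x+1)···(x+n-1)/n! is the probability generating function of the number of cycles of a uniform random permutation of n elements; its coefficients give the exact distribution, and its mean is the harmonic number H_n. A sketch that extracts both by elementary polynomial multiplication:

```python
from math import factorial

def cycle_count_pgf(n):
    """Coefficients of x(x+1)...(x+n-1)/n!: entry k is the probability
    that a uniform random permutation of n elements has exactly k cycles
    (the unsigned Stirling numbers of the first kind, normalized)."""
    coef = [1]                       # the constant polynomial 1
    for i in range(n):
        new = [0] * (len(coef) + 1)  # multiply the polynomial by (x + i)
        for j, c in enumerate(coef):
            new[j] += c * i          # the "+ i" part
            new[j + 1] += c          # the "x" part
        coef = new
    total = factorial(n)
    return [c / total for c in coef]

n = 10
probs = cycle_count_pgf(n)
mean = sum(k * p for k, p in enumerate(probs))
H_n = sum(1.0 / i for i in range(1, n + 1))
print(mean, H_n)  # both approximately 2.9290
```

The same few lines of bookkeeping yield the variance and higher moments, which is exactly the economy that makes generating functions so effective in the analysis of algorithms.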

A final volume that deserves mention here is the recent collection Probabilistic Combinatorics and Its Applications, edited by Bollobás (1991). The seven essays in this collection are all of great interest to the field, and each points toward many lively research topics. In particular, the essay by F.R.K. Chung (1991) provides quite another perspective on derandomization theory and illustrates many of the subtleties that perplex investigators who examine randomness in a quest to find acceptable surrogates.

References

Alon, N. (1992), A parallel algorithmic version of the Local Lemma, Rand. Struct. Algorithms 2, 367-377.

Alon, N., J. Spencer, and P. Erdős (1992), The Probabilistic Method, Wiley-Interscience Series in Discrete Optimization, John Wiley & Sons, New York.

Barbour, A.D., L. Holst, and S. Janson (1992), Poisson Approximation, Oxford Studies in Probability no. 2, Clarendon Press, Oxford.


Beck, J. (1992), An algorithmic approach to the Lovász Local Lemma I, Rand. Struct. Algorithms 2, 343-366.

Bender, E.A., and S.G. Williamson (1991), Foundations of Applied Combinatorics, Addison-Wesley, Reading, Mass.

Bollobás, B. (1985), Random Graphs, Academic Press, New York.

Bollobás, B., ed. (1991), Probabilistic Combinatorics and Its Applications, Proceedings of Symposia in Applied Mathematics no. 44, American Mathematical Society, Providence, R.I.

Chung, F.R.K. (1991), Constructing random-like graphs, in Probabilistic Combinatorics and Its Applications, B. Bollobás, ed., American Mathematical Society, Providence, R.I., 21-56.

Cipra, B.A. (1992), Theoretical computer scientists develop transparent proof technique, SIAM News 25 (May), 1.

Graham, R.L., D.E. Knuth, and O. Patashnik (1989), Concrete Mathematics: A Foundation for Computer Science, Addison-Wesley, Reading, Mass.

Grimmett, G. (1989), Percolation, Springer-Verlag, New York.

Johnson, D.S. (1992), The NP-completeness column: An ongoing guide (23rd edition): The tale of the second prover, J. Algorithms 13 (to appear in September issue).

Knuth, D.E. (1973), The Art of Computer Programming, vols. 1-3, Addison-Wesley, Reading, Mass.

Kolata, G. (1992), New short cut found for long proofs, New York Times, April 6, section C.

Lubotzky, A., R. Phillips, and P. Sarnak (1988), Ramanujan graphs, Combinatorica 8, 261-277.

Wilf, H.S. (1990), generatingfunctionology, Academic Press, Boston.