RANDOM WALKS WHOSE CONCAVE MAJORANTS
OFTEN HAVE FEW FACES

ZHIHUA QIAO and J. MICHAEL STEELE

ABSTRACT. We construct a continuous distribution G such that the number of faces in the smallest concave majorant of the random walk with G-distributed summands will take on each natural number infinitely often with probability one. This investigation is motivated by the fact that the number of faces F_n of the concave majorant of the random walk at time n has the same distribution as the number of records R_n in the sequence of summands up to time n. Since R_n is almost surely asymptotic to log n, the construction shows that despite the equality of all of the one-dimensional marginals, the almost sure behaviors of the sequences {R_n} and {F_n} may be radically different.

1. INTRODUCTION

If X_i, i = 1, 2, ..., is a sequence of independent random variables with a continuous distribution G, then the number of records

R_n = card{ k : 1 ≤ k ≤ n and X_k > X_i for all 1 ≤ i < k }

was studied in Rényi (1962) and was found to have the same distribution as

(1) ξ_1 + ξ_2 + · · · + ξ_n,

where {ξ_i : i = 1, 2, ...} is a sequence of independent Bernoulli random variables that satisfy P(ξ_k = 1) = 1/k. Goldie (1989) later observed, as a consequence of Spitzer's combinatorial lemma as generalized by Brunk (1964), that the number of faces of the concave majorant of the random walk S_k = X_1 + X_2 + · · · + X_k, 1 ≤ k ≤ n, also has the same distribution as R_n; that is, if one lets F_n denote the number of pieces in the smallest piecewise linear concave majorant of the set of points S_n = {(0, 0), (1, S_1), ..., (n, S_n)}, then one has P(R_n ≤ t) = P(F_n ≤ t) for all t ∈ R and all integers 1 ≤ n < ∞.
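Goldie's distributional identity is easy to probe numerically. The sketch below is ours, not from the paper: num_records counts running maxima, while num_faces counts the edges of the upper hull of the walk built with Andrew's monotone chain; the function names and the hull-based approach are our own illustrative choices.

```python
def num_records(xs):
    # count the indices at which the sequence sets a new running maximum
    count, best = 0, float("-inf")
    for x in xs:
        if x > best:
            count += 1
            best = x
    return count

def num_faces(xs):
    # number of faces of the least concave majorant of the points
    # (0, 0), (1, S_1), ..., (n, S_n): build the upper hull from left
    # to right (Andrew's monotone chain) and count its edges
    pts = [(0, 0.0)]
    s = 0.0
    for i, x in enumerate(xs, 1):
        s += x
        pts.append((i, s))
    hull = []
    for px, py in pts:
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            # drop hull[-1] if it lies on or below the segment hull[-2] -> p
            if (x2 - x1) * (py - y1) >= (px - x1) * (y2 - y1):
                hull.pop()
            else:
                break
        hull.append((px, py))
    return len(hull) - 1
```

On simulated walks with continuous summands, the sample means of num_records and num_faces both track the harmonic number H_n, just as the identity P(R_n ≤ t) = P(F_n ≤ t) predicts.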

By a standard BorelCantelli argument, one finds from the Bernoulli sum representation (1) of R., and the monotonicity Rn < R,+1 that

(2) lim Rn / log n = 1 with probability 1,
n~

so from Coldie's observation that Rn 1 Fn for each n, one might expect an analogous strong law for the sequence {Fn : n = 1,2,...}, despite the fact that the sequence {Fn : n = 1, 2, ...} is not monotone. In Steele (2002) it was suggested the

Date: June 1, 2002. 1991 Mathematics Subject Classification. Primary: 60D05; Secondary: 60C05, 68U05, 62G99. Key words and phrases. Spitzer's combinatorial lemma, random walk, convex hull, convex minorant, concave majorant.


FIGURE 1. The concave majorant of a random walk

limit law for records (2) might not extend to the face process {F_n : n = 1, 2, ...}, and the main goal of this note is to confirm a particularly strong version of this conjecture.

Theorem 1. There exists a continuous distribution function G such that if the random variables X_i, i = 1, 2, ..., are independent with P(X_i ≤ x) = G(x) for all i = 1, 2, ..., and if F_n denotes the number of faces of the least concave majorant of the first n steps of the random walk S_k, 1 ≤ k ≤ n, then

(3) P(F_n = m i.o.) = 1 for each value of m = 1, 2, ....

In particular, the concave majorant of the set S_n ⊂ R^2 will be a single line infinitely often with probability one.

The behavior described by the relation (3) contrasts about as sharply with the limit (2) as one could imagine, despite the fact that the marginal distributions of F_n and R_n are equal for each n.

2. Two CONSTRUCTIVE LEMMAS

The basic idea is that one can construct a continuous distribution G such that infinitely often the summand X_i is so large that it completely overwhelms the cumulative contributions of all of the preceding summands. The implementation of this idea rests on two simple technical lemmas. To begin, we take an arbitrary sequence of integers 2 ≤ n_1 < n_2 < · · ·, and consider independent discrete random variables Y_i, i = 1, 2, ..., such that P(Y_i = n_k) = p_k for all i ≥ 1 and k ≥ 1. Our technical lemmas tell us about events that are unlikely to take place in conjunction with the first time that an element of the sequence {Y_i} is equal to n_k.
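To see the construction in concrete terms, the following sketch (ours; the choices α = 2 and the truncation point kmax are illustrative, not fixed by the lemmas) computes the probabilities p_k = c_α(k!)^{−α} of Lemma 1 together with a sequence n_k satisfying the gap condition n_k/n_{k−1} ≥ k/p_k of Lemma 2. Exact rational arithmetic is used because the n_k grow at an enormous rate.

```python
import math
from fractions import Fraction

def construction(alpha=2, kmax=8):
    # p_k = c_alpha * (k!)^(-alpha), truncated at kmax for illustration;
    # truncation only rescales the normalizing constant c_alpha
    weights = [Fraction(1, math.factorial(k) ** alpha) for k in range(1, kmax + 1)]
    c = 1 / sum(weights)
    p = [c * w for w in weights]
    # start from n_1 = 2 and force the gap condition n_k / n_{k-1} >= k / p_k
    # of Lemma 2 by taking the smallest admissible integer at each step
    n = [2]
    for k in range(2, kmax + 1):
        n.append(math.ceil(n[-1] * k / p[k - 1]))
    return p, n
```

With these illustrative parameters one finds p_1 ≈ 0.78 and n_2 = 21, while the later n_k explode; this explosion is exactly what lets a single summand overwhelm the whole history of the walk.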


Lemma 1. For α > 1, let c_α be chosen such that p_k = c_α (k!)^{−α} is a probability measure on {1, 2, ...}. If one sets

B_{k,t} = {Y_t = n_k} ∩ {Y_s ≠ n_k for all 1 ≤ s < t} ∩ {max_{1≤s<t} Y_s > n_k},

then for A_k = ∪_{t=1}^∞ B_{k,t} one has P(A_k i.o.) = 0.
= Ek1
Proof. If we set ak j=1 PiAl Pk), then by independence of the {Yk} we have
at1).
P(Bk,t) = Pk(l  PJ1(1 k
Since the events {Bk,t}t'=1 are disjoint, we also have
00 00 00 00 1
Y~ P(Ak) E E P(Bk,t) 1: A (  _)
k=l k=l t=l k=l A (1  Mak
k
j=1 Pi~ j=k+l Pi
k 1
k=l Ei =1 pi k=l Ejw=k Pi

00
< E  < 00,
, (k + 1)a
k=l
and the proof is completed by applying the BorelCantelli Lemma.

One should note that the condition α > 1 cannot be dropped. In particular, one can easily check that if p_k = c_1/k!, then the sum of the P(A_k) diverges.

Lemma 2. Suppose the sequence {n_k} satisfies the gap condition n_k/n_{k−1} ≥ k/p_k for all k ≥ 2. If we set

E_{k,t} = {Y_t = n_k} ∩ {Y_s < n_k for all 1 ≤ s < t} and E_k = ∪_{t ≥ n_k/n_{k−1}} E_{k,t},

then we have P(E_k i.o.) = 0.

Proof. We have P(E_{k,t}) = p_k (Σ_{i=1}^{k−1} p_i)^{t−1}, so by disjointness we find

Σ_{k=1}^∞ P(E_k) = Σ_{k=1}^∞ Σ_{t ≥ n_k/n_{k−1}} p_k (Σ_{i=1}^{k−1} p_i)^{t−1} ≤ Σ_{k=1}^∞ p_k (Σ_{i=1}^{k−1} p_i)^{n_k/n_{k−1} − 1} / (1 − Σ_{i=1}^{k−1} p_i)
≤ Σ_{k=1}^∞ (1 − p_k)^{(k/p_k) − 1},

where the last step uses Σ_{i=1}^{k−1} p_i ≤ 1 − p_k, the gap condition n_k/n_{k−1} ≥ k/p_k, and 1 − Σ_{i=1}^{k−1} p_i ≥ p_k. From the bound 1 − x ≤ e^{−x} and the geometric sum, one sees that the last sum is not larger than (1 − e^{−1})^{−1}, so the Borel–Cantelli Lemma again completes the proof.
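As a numerical sanity check of the last display (our sketch; the choices α = 2 and the truncation point are illustrative), the terms (1 − p_k)^{(k/p_k)−1} do decay like e^{−(k−1)} and their sum stays well under (1 − e^{−1})^{−1} ≈ 1.58:

```python
import math

# p_k = c_alpha (k!)^(-alpha) with alpha = 2, truncated at kmax;
# both alpha and kmax are illustrative choices, not fixed by the lemma
alpha, kmax = 2, 8
weights = [math.factorial(k) ** (-alpha) for k in range(1, kmax + 1)]
c = 1.0 / sum(weights)
p = [c * w for w in weights]

# the summands (1 - p_k)^(k/p_k - 1) from the proof of Lemma 2
terms = [(1 - pk) ** (k / pk - 1) for k, pk in enumerate(p, 1)]
total = sum(terms)
```

Each term is bounded by e^{−(k−1)}, and the total here comes out to about 0.87.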

3. PROOF OF THEOREM 1

We now say that a random time τ is good provided that

τ ≥ 2,  Y_j < Y_τ for all 1 ≤ j < τ,  and  τ ≤ n_k/n_{k−1} for the index k with Y_τ = n_k;

so the main point of Lemmas 1 and 2 is that they immediately imply that with probability one there exists an infinite sequence τ_1 < τ_2 < · · · of good times. Indeed, by Lemma 1 the first occurrence of n_k is, for all but finitely many k, also a running maximum of the sequence {Y_i}, and by Lemma 2 it occurs before time n_k/n_{k−1}. To complete the proof of Theorem 1, we just need to connect the existence of these good times to the geometry of the concave majorant of an appropriate random walk.


Specifically, we let U_n, n = 1, 2, ..., denote a sequence of independent random variables with the uniform distribution on (0, 1), and we set X_n = Y_n + U_n, where the Y_n are as before. We will now focus on the random walk S_n = X_1 + X_2 + · · · + X_n and confirm that the continuous distribution G(x) = P(X_k ≤ x) satisfies the claims of Theorem 1.
The key geometric fact of the construction fits snugly into a single line:

(4) if τ is a good time, then F_τ = 1.
To see why this assertion is true, we first note that

(5) S_τ > n_k + (τ − 1)n_1 ≥ τ n_{k−1} + (τ − 1)n_1 ≥ τ n_{k−1} + τ = τ(n_{k−1} + 1),

where the second step uses τ ≤ n_k/n_{k−1} and in the last step we use the facts that n_1 ≥ 2 and τ ≥ 2. We then note that each summand X_s with s < τ satisfies X_s < Y_s + 1 ≤ n_{k−1} + 1, so for all t < τ we have

(6) S_t < t(n_{k−1} + 1),

and by (5) this gives S_t < (t/τ)S_τ for all 1 ≤ t < τ; that is, every interior point (t, S_t) lies strictly below the chord from (0, 0) to (τ, S_τ), and the truth of the assertion (4) follows immediately from the bounds (5) and (6).
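The chord argument behind (4), (5) and (6) can be checked mechanically. In this sketch (ours; the particular step values are illustrative, playing the roles n_{k−1} = 2, n_k = 12 and τ = 4 ≤ n_k/n_{k−1}), a walk has a one-face majorant exactly when every interior point lies strictly below the chord from (0, 0) to (τ, S_τ):

```python
def single_face(steps):
    # the least concave majorant is a single segment iff every interior
    # point (t, S_t) lies strictly below the chord from (0, 0) to (n, S_n)
    n, total = len(steps), sum(steps)
    partial = 0.0
    for t in range(1, n):
        partial += steps[t - 1]
        if partial >= t * total / n:
            return False
    return True

# three modest summands below n_{k-1} + 1 = 3, then one huge summand
# Y_tau + U_tau with Y_tau = n_k = 12, as at a good time tau = 4
good = [2.1, 2.9, 2.4, 12.5]
# the same summands with the huge one first: no longer a single face
bad = [12.5, 2.1, 2.9, 2.4]
```

Moving the huge summand to the front destroys the chord bound, and single_face reports False.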
By assertion (4) and the almost sure existence of an infinite sequence of good times, we therefore find that F_n = 1 infinitely often with probability one. Now we only need to check that for each m > 1 we also have F_n = m infinitely often with probability one. The basic idea here is that we get infinitely many independent tries at an event whose probability is uniformly bounded away from zero.
More formally, since the summands {X_k = Y_k + U_k : k = 1, 2, ...} are nonnegative, elementary geometry tells us for each good time τ and for each m ≥ 2 that the event

{ S_τ/τ > X_{τ+1} > X_{τ+2} > · · · > X_{τ+m−1} }

implies the event F_{τ+m−1} = m. Also, the event X_{τ+1} > X_{τ+2} > · · · > X_{τ+m−1} has probability greater than that of the event

C_m = {Y_{τ+1} = Y_{τ+2} = · · · = Y_{τ+m−1} = n_1} ∩ {U_{τ+1} > U_{τ+2} > · · · > U_{τ+m−1}},

and C_m has probability p_1^{m−1}/(m − 1)! = δ_m > 0, which does not depend on τ. Moreover, by (5) one always has S_τ/τ > n_1 + 1 provided that Y_τ ≥ n_2, so along the infinite sequence of good times τ_j, j = 2, 3, ..., one finds infinitely many opportunities of observing F_{τ+m−1} = m that are independent and that have probability δ_m > 0. Thus, by the law of large numbers, one finds that with probability one we have F_n = m for infinitely many n, and the proof of Theorem 1 is complete.

4. A FINAL OBSERVATION

The distribution G constructed here suffices to show that one cannot expect regular behavior of F_n at the same level of generality at which one finds regularity for R_n. Nevertheless, the sequence F_n may not always be badly behaved. Under nice conditions, say, for example, when the summands are exponentially distributed, one may be able to prove a useful limit law for F_n.
In the hunt for such a law, it may be useful to note that for summands with a continuous distribution one always has

(7) lim sup_{n→∞} F_n / log n ≥ 1 with probability one.


Moreover, from the equality of the distributions of R_n and F_n and the representation (1) for R_n, one has the useful large deviation bound:

(8) P(|F_n − H_n| ≥ εH_n) ≤ 2 exp(−ε²H_n/4) for all 0 < ε < 1/2,

where H_n = 1 + 1/2 + 1/3 + · · · + 1/n. In fact, the bound (8) follows immediately from the usual concentration inequalities for Bernoulli sums, say, for example, Bennett's inequality (Bennett (1962), equation (8b)). Finally, from (8) one easily proves (7) with the Borel–Cantelli lemma and a subsequence argument.
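Since (8) is really a statement about the Bernoulli sum (1), it can be probed by simulating that sum directly. In the sketch below (ours; the seed, the value of n and the trial count are arbitrary choices), the empirical deviation frequency stays below the stated bound, though for moderate n the bound exceeds 1 and only becomes informative once H_n is large:

```python
import math
import random

def bernoulli_sum(n, rng):
    # the representation (1): xi_k = 1 with probability 1/k
    return sum(1 for k in range(1, n + 1) if rng.random() * k < 1.0)

rng = random.Random(7)
n, trials, eps = 1000, 2000, 0.4
H = sum(1.0 / k for k in range(1, n + 1))
hits = sum(abs(bernoulli_sum(n, rng) - H) >= eps * H for _ in range(trials))
empirical = hits / trials
bound = 2.0 * math.exp(-eps * eps * H / 4.0)
```

For n = 1000 one has H_n ≈ 7.49 and the right side of (8) is about 1.48, so the bound is vacuous at this scale; still, the observed frequency is far smaller, consistent with (8).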

REFERENCES

G. Bennett, Probability inequalities for sums of independent random variables, J. Amer. Statist. Assoc., 57 (1962), 33–45.

H.D. Brunk, A generalization of Spitzer's combinatorial lemma, Z. Wahrscheinlichkeitstheorie, 2 (1964), 395–405.

C.M. Goldie, Records, permutations and greatest convex minorants, Math. Proc. Camb. Phil. Soc., 106 (1989), 169–177.

A. Rényi, On the extreme elements of observations, MTA III. Oszt. Közl., 12 (1962), 105–121. (Also available in Selected Papers of Alfréd Rényi, Vol. 3, 50–66, P. Turán, editor, Akadémiai Kiadó, Budapest, 1976.)

F. Spitzer, A combinatorial lemma and its application to probability theory, Trans. Amer. Math. Soc., 82 (1956), 323–339.

J.M. Steele, The Bohnenblust–Spitzer algorithm and its applications, Journal of Computational and Applied Mathematics, 142 (2002), 235–249.

DEPARTMENT OF STATISTICS, WHARTON SCHOOL, STEINBERG HALL-DIETRICH HALL 3000, UNIVERSITY OF PENNSYLVANIA, PHILADELPHIA, PA 19104