## Course Blog Fall 2014

## Grades Have Been Posted

Have a safe and happy holiday. See you next year!

## Day 26 Backwards Martingales---and a Look Back

We'll finish a theorem begun last time on square integrable martingales, then we'll look at backward martingales. These have many miraculous applications, but we'll just look at one --- we'll get a new proof of the adult strength SLLN. We'll then look back at what we've done this semester; we'll get a big picture mind map. Finally if there is time, we'll look at one more martingale application. They are like potato chips.

## Day 25 Square Integrable Martingales and Doob's Decomposition

We observed that if you square a martingale, you get a submartingale with some special properties. In particular, it can be written as the sum of a new martingale and a nonnegative, nondecreasing, predictable process A_n. This is the Doob decomposition, and it has some very useful properties. In particular, A_n can be used to build some useful stopping times. We saw that this process relates nicely to the Doob maximal inequality, and we proved yet another martingale convergence theorem (cf. Durrett's 5.4.8 and 5.4.9). Everyone should read the rest of Section 5.4; in particular, Theorem 5.4.10 is well worth understanding and Theorem 5.4.11 will add one more layer of sophistication to your understanding of the Borel-Cantelli world.
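For reference, here is the decomposition in symbols --- a sketch written in standard notation (consistent with, though not copied from, Durrett): if M_n is a square integrable martingale and X_n = M_n^2, then

```latex
% Doob decomposition: X_n = N_n + A_n, with N_n a martingale and A_n
% predictable (A_n is F_{n-1} measurable), nonnegative, and nondecreasing.
X_n = N_n + A_n, \qquad
A_n = \sum_{k=1}^{n} E\bigl[\, X_k - X_{k-1} \mid \mathcal{F}_{k-1} \bigr]
    = \sum_{k=1}^{n} E\bigl[\, (M_k - M_{k-1})^2 \mid \mathcal{F}_{k-1} \bigr].
```

The second equality uses the martingale property of M_n to kill the cross term.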

## Day 24 Continuing with Doob's Maximal Inequality

For warm-up problems we did quick proofs of Holder's inequality and --- more to the point --- Minkowski's inequality. The second proof introduced the principle of the "flop".

We then reviewed Doob's L^1 maximal inequality and examined Wiener's famous argument for passing from a weak-type inequality to an L^p inequality. This is a lovely computation with many appearances in mathematics.

We then took up the famous ABRACADABRA problem and did an exact computation of the time until a lucky monkey types the magic word. Finally, we looked back at the up-crossing inequality and noted some almost sure convergence theorems. We also considered the analogy between submartingales and increasing sequences of real numbers.
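The fairness-principle answer admits a quick sanity check in code. This is a sketch of my own (the function names are mine, not from lecture): the expected waiting time is the sum of 26^k over every k for which the length-k prefix of the word matches its length-k suffix, and a seeded simulation of a tiny two-letter analogue agrees with the formula.

```python
import random

def expected_time(pattern, alphabet_size):
    """Fairness-principle answer: sum alphabet_size**k over every k such
    that the length-k prefix of the pattern equals its length-k suffix."""
    return sum(alphabet_size ** k
               for k in range(1, len(pattern) + 1)
               if pattern[:k] == pattern[-k:])

# The magic word takes 26 + 26**4 + 26**11 keystrokes on average,
# thanks to the overlaps "A" and "ABRA".
abracadabra = expected_time("ABRACADABRA", 26)

# Sanity check on a tiny analogue: a monkey at a two-key typewriter
# waits 2**2 + 2 = 6 keystrokes on average to produce "11".
def waiting_time(pattern, alphabet, rng):
    history = ""
    while not history.endswith(pattern):
        history += rng.choice(alphabet)
    return len(history)

rng = random.Random(2014)
mean_wait = sum(waiting_time("11", "01", rng)
                for _ in range(20000)) / 20000
```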

## "Wednesday is Friday" --- So No Class 11/26

I am delighted that you can travel safely. Have fun and eat responsibly! See you back on December 1. We'll continue from Doob's inequality.

## Day 23 Doob's Inequalities and Convergence Theorems

We start with two warm-ups. First there is a mild epiphany that relates integration by parts to symmetry and translation invariance; second there is a high-altitude view of the biased-coin martingale --- we did a "change of group."

The main goal is to continue with bread-and-butter martingale theory. In particular, we'll work toward several proofs of the martingale convergence theorem, of which there are several flavors. To develop these proofs we need first to "robustify" the notion of martingale. This takes us to the notions of a submartingale (good game) and supermartingale (bad game).

The named results that are highest on our screen are Doob's maximal inequality and Doob's up-crossing inequality. One gets martingale convergence theory as a corollary to either of these.

Note: The Wiki does a subpar job on Doob's maximal inequality. You'll want to have a better version in your pocket. The Wiki Martingale Convergence Theorem page is better, but it is still a little weak.

## Day 22. Martingale Theory Begins Now

We'll have two useful warm-up problems. First we'll see how the CLT combines with a nice Poisson telescoping argument to give the world's slickest proof of Stirling's formula. We'll then discuss a tiny piece of characteristic function theory that we did not cover before; it's called the Riemann-Lebesgue Lemma and it comes in several flavors.

We'll then continue with the intuitive use of martingales. In particular, we'll get the ruin probability for a game with a biased coin. We'll also look at some linear algebra connections --- a rich area for discovery.

**Next comes core theory.** We'll look at the notion of a stopping time and start working toward what are called the "systems theorems". These are the theorems that justify the "fairness principle". We'll also start to work with martingale transforms and we may get as far as proving Doob's maximal inequality.

I'll hand back the mid-term exams and post the stem-and-leaf plot so you'll see where you stand vis-a-vis your peers.

## Day 21: Introduction to Martingales

We'll start with a reminder of the definition of conditional expectation --- the formal one in terms of sigma-fields. We'll then look at the principle of "conditional everything"; specifically we'll get a conditional Cauchy-Schwarz, a conditional Holder, a conditional Jensen, and a conditional Markov.

We'll then start working with the definition of a martingale. This is a brilliant mathematical structure that began life as a model for fair games. For one day, and one day only, we will stress the intuitive side of martingales. In particular, we'll introduce an idea called the fairness principle and we'll use it to solve several down-to-earth problems. The most important of these is the calculation of the ruin probability for biased random walk. There are many people whose lives would be changed vastly for the better if they could simply understand this one formula at an honestly visceral level.
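If you want to see the formula in action, here is a quick sketch (the function names and parameter choices are mine): for a walk started at a that is absorbed at 0 or n, with up-step probability p and r = (1-p)/p, the classical answer is P(ruin) = (r^a - r^n)/(1 - r^n), and a seeded simulation agrees.

```python
import random

def ruin_probability(a, n, p):
    """P(hit 0 before n) for a walk started at a with up-step probability p."""
    if p == 0.5:
        return 1 - a / n                   # fair-game special case
    r = (1 - p) / p
    return (r**a - r**n) / (1 - r**n)

def simulate_ruin(a, n, p, trials, rng):
    ruined = 0
    for _ in range(trials):
        x = a
        while 0 < x < n:
            x += 1 if rng.random() < p else -1
        ruined += (x == 0)
    return ruined / trials

rng = random.Random(530)
exact = ruin_probability(5, 10, 0.45)      # a modestly unfavorable game
estimate = simulate_ruin(5, 10, 0.45, 20000, rng)
```

Even the small bias p = 0.45 pushes the ruin probability from 1/2 up to about 0.73 --- the visceral point of the formula.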

While the Wikipedia article on martingales may be worth a look, this is one time where I think the article really stinks. I understand why some of the expository decisions were made, but, in due course, I'll explain why I think they made bad decisions. Unfortunately the mess looks too big to fix, but maybe you will see a way to make it at least a little better.

## Final Exam: Planning Ahead

I will shortly post the first part of the take-home final exam. I will continue to add problems to the exam until the last day of class. The instructions of the exam are written in considerable detail, but the most important detail is that you are not to talk to anyone about any of the problems, nor are you to get any information from any person in any way. **The final exam is absolutely your own work. You will be required to sign an acknowledgment that you have read and understood the conditions of the exam.**

## Uncluttered Proof of the Spectral Theorem

There is an almost unending sequence of theorems that are called the "spectral theorem" but the undeniable starting point is the result that if A is a symmetric NxN real matrix then A has N real eigenvalues and there is a basis consisting of N real orthonormal eigenvectors. The typical proof of this theorem jiggles around developing various useful facts before getting to the moneyball. Wandering the web, I hit upon a very attractive, uncluttered proof. This will not be used in 530 but it will be useful in 531.

## Midterm Reminder

Your midterm is on **Monday November 10 from 10:30 until 11:50**. Please seat yourselves as far apart as you can. Please read the instructions on the cover page of the test before you begin. For information on the content of the test, please see the earlier blog post. As a reminder, you should expect the test to seem long. It is straightforward if you are well prepared. Nothing on the test should come as a surprise, but in any event don't let yourself get stuck. Be sure to give a full accounting of your knowledge.

## Day 20 Conditional Expectations in Earnest

This is a foundational day devoted to the definition and construction of conditional expectations. If this is done in absolute completeness, it will turn your hair gray and your bones to chalk. We will do just what we need to do to set up our next important topic: martingales. Martingales can pay the rent. The foundations of conditional expectations just keep the pipes from leaking.

I'll hand back HW 10. This will complete the set. **The midterm will be 10:30-11:50 on Monday November 10.** The test will begin and end on time. A description of the test has been given earlier in the blog. **There will be no class on Wednesday November 12;** this is your own very special Fall break.

For a variety of reasons, this could be a good time to review the Course Policies.

## Day 19 Poisson Approximation: Two Ways

We continue our conversation on Poisson approximation. First we'll recall the coupling argument that gives Le Cam's inequality. We'll then look at a semi-group proof of Le Cam's inequality and marvel at the difference between the two.

The other big topic of the day is conditional expectation. We saw last time that the confident calculation of conditional expectations is not guaranteed. We need to build a sturdy foundation, and this takes us back to the business of sigma-fields.

Finally, there are some small problems floating around that we have mentioned but not completed. One of these is recurrence of random walks in dimensions 1 and 2 plus the transience in dimension 3 and higher. This is Polya's theorem for random walk and it sets the stage for many questions in the theory of Markov chains. We'll also discuss the midterm.

## Nice Pictures of Taylor Approximations of e^z

You think you understand something pretty well, like the Taylor expansion of e^z, but then you look at some pictures and you see, wow, there are mysteries here!

## Midterm Heads Up

I have written the mid-term that you will have at class time on Monday, November 10. There are 8 problems: (a) two problems just require you to state theorems, inequalities, lemmas, or other facts, (b) two problems are clones of HW problems, (c) one problem requires the use of some kind of subsequence argument (or else you'll need to find a method that is different from mine), (d) one problem gives a list and asks "Is it a CHF or not?", and (e) two problems are not easy to classify but they are intended to test understanding of modes of convergence. There may be changes before test time, but they are not likely to be substantial.

## Writing about Programming

I just hit upon a long, interesting document at Stanford CS about writing for the mathematical and computing sciences. I think it is marvelous, but it is not for reading at a single sitting. You might want to dip into it from time to time.

## Day 18: Lindeberg's Motivation, TV Distance and Coupling

We had two warm-up discussions. First we looked at the strange density 1/(\pi cosh x) and its transform, and we pondered the analogy with the normal density. We then gave a picture proof of the convolution formula that we needed in our mollification discussion.

Next we considered the problem that motivated Lindeberg, the number of cycles in a random permutation. We solved this by showing this was the same as the number of records in a sequence of n observations from independent, continuously distributed random variables. Incidentally, the Wikipedia article on Euler's constant has some striking facts; all we needed was that H_n and log n are asymptotic.
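The record-counting identity is easy to test numerically. In this sketch (names mine), the k-th observation is a record with probability 1/k, so the expected number of records in n continuous iid observations is exactly the harmonic number H_n:

```python
import random

def harmonic(n):
    """H_n = 1 + 1/2 + ... + 1/n, the expected number of records."""
    return sum(1.0 / k for k in range(1, n + 1))

def count_records(xs):
    best, records = float("-inf"), 0
    for x in xs:
        if x > best:                 # a new running maximum is a record
            best, records = x, records + 1
    return records

# Seeded Monte Carlo: average record count over many length-n samples.
rng = random.Random(1729)
n, trials = 200, 5000
avg_records = sum(count_records([rng.random() for _ in range(n)])
                  for _ in range(trials)) / trials
```

Since H_n and log n are asymptotic, the average hovers near log(200) + gamma, just as the Euler's constant article suggests.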

We then looked at Poisson approximation, first proving the classic limit then discussing the amazingly more general and more powerful Le Cam inequality. This led to a discussion of the total variation distance and the notion of coupling. We drew the picture for the Serfling coupling, but we left the final steps of Le Cam's inequality for next time.
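To make the total variation distance concrete, here is a small sketch (written by me, not from lecture) that computes d_TV between a Binomial(n, p) and its matching Poisson(np) and checks it against the Le Cam bound, which for identical p_i is sum p_i^2 = n p^2:

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

def tv_distance(n, p, terms=100):
    """d_TV = (1/2) * sum_k |P(S_n = k) - P(Poisson(np) = k)|, truncated
    at `terms`; the neglected Poisson tail is negligible here."""
    lam = n * p
    return 0.5 * sum(abs((binom_pmf(k, n, p) if k <= n else 0.0)
                         - poisson_pmf(k, lam))
                     for k in range(terms))

n, p = 50, 0.04
d = tv_distance(n, p)            # distance to Poisson(2)
le_cam_bound = n * p * p         # Le Cam: d_TV <= sum of p_i^2
```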

Finally, we looked at a "paradox" in conditional expectations.

## Stein's Method Resources

In the next few lectures we'll be continuing with Stein's method for CLT and Poisson Approximation. Two resources for this are the AMM paper by Larry Goldstein and my AMM paper on Le Cam's Inequality. Some of this material is in Durrett, but these papers give a little richer view.

## Day 17: Lindeberg Swapping, Stein's Method, and More

For the warm-up we'll consider a problem that illustrates the use of Parseval's formula and the principle of "smooth truncation." It also adds one more layer of knowledge about the Dirichlet integral.

The main task will be to explore Lindeberg's Swapping Trick. This argument is of very wide applicability and it gives the proof of the CLT that has the absolute least overhead. This trick predates Stein's method, but now they are often viewed as intertwined. I'll draw out the distinctions as the discussion progresses. We will need to develop some general facts about convergence in distribution.

Here is our rolling "future topics" list: (i) an introduction to stable laws, (ii) the Evans and Zhou construction, (iii) recurrence and transience of random walk, (iv) a surprising density/chf pair with the reciprocal of cosh(x) on both sides, (v) exchangeable random variables, (vi) conditional expectation (examples), (vii) the Minkowski inequality, (viii) Poisson approximations.

HW 10 (PDF and Latex Source). This is due on November 3 and it is our last HW. There are only four problems.

You will have an **in-class mid-term on November 10** and there will be no class November 12. On November 17 I will start handing out the problems for the take-home final. I will continue to add to the list until December 8, our last day of class. The final exam will be due on December 17 at NOON. This is the official day of our final exam as posted by the registrar. You'll get further details when I start to hand out the problems.

## Day 16: More on Moments and Distributions

As a warm-up, we developed the CHF for the double exponential and the Cauchy. We also discussed the notion of entropy of a random variable and its special connection to the uniform distribution.

The main task was to complete the discussion of the "moment problem," i.e. when do (or don't) the moments of a random variable determine its distribution. In the positive direction the trick is to use infinitely many Taylor expansions.

We briefly discussed the relationships between matchings in a graph and normal moments. In particular, the number of perfect matchings of the complete graph on 2n vertices is equal to the 2n-th moment of a standard normal. We then discussed the Geometry of Cauchy's Distribution (i.e. the random pistol range problem).
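That matching identity is pleasant to verify by brute force. A sketch of mine: count the perfect matchings of K_{2n} recursively, compare with the double factorial (2n-1)!!, and check the 4th normal moment by a seeded simulation:

```python
import random

def double_factorial(m):
    """(2n-1)!! = (2n-1)(2n-3)...3*1: counts perfect matchings of K_{2n}
    and equals the 2n-th moment of a standard normal."""
    out = 1
    while m > 1:
        out *= m
        m -= 2
    return out

def count_perfect_matchings(vertices):
    """Brute force: pair the first unmatched vertex with each possible
    partner and recurse on what remains."""
    if not vertices:
        return 1
    rest = vertices[1:]
    return sum(count_perfect_matchings(rest[:i] + rest[i + 1:])
               for i in range(len(rest)))

matchings_k6 = count_perfect_matchings(tuple(range(6)))   # 5!! = 15

# Seeded Monte Carlo check of the 4th moment of N(0,1): (4-1)!! = 3.
rng = random.Random(530)
samples = 200000
m4 = sum(rng.gauss(0.0, 1.0) ** 4 for _ in range(samples)) / samples
```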

In due course, we may explore Stein's method for the CLT. The essay by Chatterjee has much more than we will ever consider, but it can give you a sense of the modern development of Stein's method. Take a peek if you are feeling ambitious.

## Laplace Approximation for Integrals

The Wikipedia article on the Laplace approximation is pretty good. In particular, the proof that is given there is honest and complete. What the Wiki covers is just the classic case, so a little extra work is needed to cover the Wallis integral. Still, once you know how to do one of these cases, they are all very much the same.

## Day 15: Using the CLT and CHF Technology

This is a problem-oriented day where we will look at an eclectic set of questions and their solutions. One question involves a fixed point equation, and we solve it using the CLT. We then looked at a heuristic derivation of Parseval's Theorem, which gives us one of the most important identities in mathematics.

We then discussed what it means for a sequence of characteristic functions to converge just in a neighborhood of zero. In the special case when the limit CHF is 1, this turns out to be enough. This fact can be used to prove sophisticated versions of the Weak Law of Large Numbers. Finally, we began the discussion of the determination of a distribution by its moments.

HW 9 (PDF Latex Source) is due on Monday October 27.

## Logistic Distribution, CHFs, and Elo++

We'll shortly look at some more "exotic" distributions and their characteristic function theory. While playing with the logistic distribution, I hit upon an interesting paper on chess rating systems and their predictive power. If you are interested in the prediction of outcomes of contests, it is worth a look.

## Mid-Term Rules

In an in-class exam, it is not "fair" to expect people to come up with the solution to an original problem. On the other hand, it is certainly fair to ask one to prove almost anything that has been proved in class. The "thing" probably can't be too "big" but **any piece** of the SLLN, the L^1 or L^2 maximal inequality, Levy's inversion formula, Levy's continuity theorem, or the Lindeberg CLT should be fair game. One can add more to the list, but you get the point. Also definitions and theorem statements --- oh, yeah, they are all fair game.

## Day 14: CLTs --- Basic and Lindeberg

This is a "money ball" day. There is nothing more central to probability theory than the central limit theorem. This is material that everyone should absolutely master, and we cover the ground carefully.

After reviewing the ingredients, we bake the cake. First it is the basic CLT --- an honest adult version where we assume that we have second moments and nothing more.

We then consider the **Lindeberg CLT** in two phases: (a) understanding the Lindeberg set up and the Lindeberg condition and (b) proving the Lindeberg CLT. Pleasantly enough, the proof follows the pattern that was used for the basic CLT.
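For the record, here is the statement we are working toward, in the standard notation (independent, mean-zero X_k with variances sigma_k^2 and s_n^2 = sigma_1^2 + ... + sigma_n^2):

```latex
% Lindeberg condition and conclusion:
\frac{1}{s_n^2} \sum_{k=1}^{n}
  E\bigl[\, X_k^2 \,\mathbf{1}\{ |X_k| > \epsilon s_n \} \bigr] \longrightarrow 0
\quad \text{for every } \epsilon > 0
\;\Longrightarrow\;
\frac{X_1 + \cdots + X_n}{s_n} \Rightarrow N(0,1).
```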

If time permits, there are buckets of smaller topics on our plate: entropy, Parseval's formula, and the relation between moments of random variables and derivatives of ch.f.s --- perhaps including the solution of some recent HW problems.

## Information Theoretical Entropy

It won't be for several days, but eventually we'll discuss the notion of entropy as it appears in information theory. As usual, the Wikipedia article is a little diffuse, but it is still worth a quick read. We'll only use entropy in one HW problem, but it is a brilliant entity that everyone should ponder to some extent.

## Day 13. Applications of Levy's Continuity Theorem

We began with a warm-up problem that gave a basic inequality for the remainder in the Taylor approximation of e^{ix}. It is simple, but there are some subtleties to it. We then looked at what this inequality tells us for bounded random variables.
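The inequality in question is the standard remainder bound, recorded here for reference:

```latex
% Taylor remainder bound for e^{ix}; the minimum is what makes it useful
% both for small |x| and for moderate |x|:
\left| e^{ix} - \sum_{k=0}^{n} \frac{(ix)^k}{k!} \right|
  \;\le\; \min\!\left( \frac{|x|^{n+1}}{(n+1)!},\; \frac{2\,|x|^{n}}{n!} \right).
```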

We then revisited Kolmogorov's one series theorem and contemplated its converse. This led us relatively quickly to the famous Kolmogorov Three Series Theorem --- the K3S. Our main tool was the characteristic function for a bounded random variable. We looked at a "double triangle diagram" that captures the logic of the K3S. Such diagrams are a good way to get a long argument into one's head.

**Homework 8**: The PDF and the Latex Source. This is due on Monday October 20.

Just as a general heads-up, the **final exam will be a take-home** that will be due on **Wednesday December 17 at Noon in my mail box (JMHH Suite 400, Department of Statistics).** I will hand out the first tranche of the final before the Thanksgiving Holiday and I will then add problems until the last day of class, Monday December 8.

## Day 12. Levy's Continuity Theorem

For the warm-up we looked at a simple example of an argument that one can awkwardly call --- "uniqueness plus compactness gives convergence."

We then proved Levy's continuity theorem. The basic ingredients were (a) the "continuity at zero implies tail bound" inequality and (b) Helly's selection theorem, which we proved last time. The proof of Levy's continuity theorem then followed the "uniqueness plus compactness gives convergence" dance.

Our first application of Levy's continuity theorem was the proof that all of the Polya type functions are actually characteristic functions. This result may look special, but it is very handy for answering many questions. It is especially useful for building examples.

Finally we discussed extreme points and Choquet's theorem. This is a rich but technical subject. Nevertheless, just knowing the basic ideas of the theory can add greatly to one's intuition. In fact, one often uses Choquet's theorem to guess what is true, but, once a conjecture has been framed, it is often easier to prove the conjecture by concrete means rather than to check the details needed to prove Choquet's theorem. Our algorithmic proof of Polya's theorem illustrated this idea.

## Choquet's Theorem

Sometime in the next few lectures, we'll bump into Choquet's theorem, a result that seems pretty obvious but which has some remarkable consequences. We won't cover Choquet's theorem in detail, but it is worth knowing about because it suggests many results that one can also prove "with bare hands." The Wikipedia article on Choquet's theorem assumes some familiarity with functional analysis, but it can still offer a useful introduction.

## Day 11. Convergence in Distribution: Fourier View

We'll look at the notion of convergence in distribution of real valued random variables, and a few words will be tucked in on "weak" convergence of random variables that take values in a metric space.

The main theorem in our sights is Levy's continuity theorem which we'll do next time. We set this up by proving the Helly selection theorem. This involved

- The notion of tightness and illustration by example
- Recollection of the diagonal argument
- The proof of Helly's theorem.

After getting Helly's selection theorem, we used the Levy inversion formula to find a density that has a "tent" for its characteristic function. We'll also use this next time.
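For reference, the pair we obtained is the classical one:

```latex
% Density with a triangular ("tent") characteristic function:
f(x) = \frac{1 - \cos x}{\pi x^2}, \qquad
\varphi(t) = (1 - |t|)^{+} =
  \begin{cases} 1 - |t|, & |t| \le 1, \\ 0, & |t| > 1. \end{cases}
```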

In due course, we will develop the smoothness and moment relationships, density characteristic function pairs, Polya's density, Polya type characteristic functions, WLLN, and Kolmogorov 3 Series theorem. We'll take one step at a time. This kind of activity will be with us for at least two more weeks.

HW 7 (and Latex source). Due Monday October 13.

## Arzela-Ascoli Theorem

The Wikipedia article on the Arzela-Ascoli theorem is decent and reading this piece is good preparation for our proof of the continuity theorem for characteristic functions. You may find better discussions in standard textbooks, like Rudin.

*A priori*, the article on the Helly selection theorem would be more relevant. Unfortunately, the venerable Wiki pays too much attention to the bounded variation case, and we just need the simpler monotone case.

## Coaching on HW 6 Problem 1

This problem really can teach you something about problem solving and I hate to give a hint that is so large that it gives the problem away. Still, some people are finding themselves going in circles so I offer some coaching:

- There is no need to reinvent the wheel. You can use the Paley-Zygmund inequality; you don't have to rederive it.
- If you stare at PZ, you'll see that it suffices to show E[S_n^2] < 5( E[S_n])^2.
- This means that you need to estimate E[S_n^2] **in terms** of (E[S_n])^2. This should give you the focus you need to keep from going in circles.
- If you need one more suggestion, don't forget the relations you know for Var[S_n]. These put some cancellations into the game that can help.
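For reference, the inequality being invoked, in one common form: for a nonnegative random variable Z with E[Z^2] < infinity and 0 <= theta < 1,

```latex
% Paley-Zygmund inequality:
P\bigl( Z > \theta\, E[Z] \bigr)
  \;\ge\; (1 - \theta)^2 \, \frac{(E[Z])^2}{E[Z^2]}.
```

So once you know E[S_n^2] < 5 (E[S_n])^2, the right-hand side is at least (1 - theta)^2/5, which is all the uniformity you need.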

## Day 10. Levy's Inversion Formula

We've discussed Levy's inversion formula and we have developed the basic tool for its proof --- the finite form of Dirichlet's Discontinuous Integral. Today we'll lay out the proof of Levy's formula and some corollaries. Further, we'll see how Levy's formula is part of "a theory", one of several that are embedded into what we call probability theory.

Other agenda items require a less sustained development, but they are important when the time comes to use characteristic functions. Specifically, we'll develop the relationships between characteristic functions and derivatives. We'll also look at other manifestations of the meta-principle that "the behavior of the ch.f. at 0 tells you about the behavior of the tail of the distribution, and vice-versa." This principle will be an important guide for the next few days.

## Heads Up --- NNQ

There will be a **No-Name-Quiz on Wednesday.** Please review the most important results that we have covered in the last two weeks, and get them into short-term memory. They can never be part of long-term memory if they don't first make it to short-term memory!

The venerable Wikipedia does a decent biographical job for Dirichlet, but its discussion of Dirichlet's discontinuous integral only gets a B minus. Feel free to edit it.

On other business, there are two more proofs of the L^1 maximal inequality based on two lemmas of independent interest. These are the "leader lemma" and the "rising sun lemma". One of the reasons these interest me is that they are just facts about real numbers; it's hard to get closer to the bone.

## Day 9. Characteristic Functions and Dirichlet's Integral

For the warm-up problems, we'll look at examples of characteristic functions and Laplace transforms. We then look at some general properties of characteristic functions including the uniform continuity property and the positive definite property.

Next we look at the integral of sin x over x. We will cherry pick from the scholium on sin(x)/x but there will be lots left for you to read. An important goal will be to give a clear and complete discussion of Dirichlet's Integral. You will want to digest this pretty quickly; it will be needed next class, and you don't want it to be a mystery.

As time permits, we'll discuss other properties of characteristic functions and look at some big picture issues. We'll be doing classical mathematics for a few days, and you want to make sure you can see the forest for the trees --- however lovely the trees may be.

Homework 6 is due on Monday October 6. If you have not been using **Latex** yet, it is time to bite the bullet. From this point onward, Latex is required. You can save a little typing if you start with my Latex source for the problems. Be sure to read the text through section 3.3. Not everything can be covered in class.

## Homework Policies

We're well into the homework process now, but it seems that some reminders are needed about HW policies. You are strongly encouraged to do the HWs individually. **At least 80% should be done individually.** If you collaborate with a fellow student on a problem or if you get a hint from a friend, you should acknowledge this by a written statement on your HW.

## Scholium on Sin X over X

We will shortly need some information about the integral of sin(x)/x. We won't need a ton of information, but this topic offers a beautiful window on classical mathematics. You should take a peek if you have the time. I've written a brief and leisurely scholium on sin(x)/x, which has more than we need --- but you may still find it useful (even entertaining). Among other things, it gets the Laplace transform of sin (x) by five different methods.
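One of the five methods fits in three lines: integrate by parts twice and solve the resulting equation for the transform.

```latex
% L(s) = Laplace transform of sin(x); two integrations by parts give
L(s) = \int_0^\infty e^{-sx} \sin x \, dx
     = 1 - s \int_0^\infty e^{-sx} \cos x \, dx
     = 1 - s^2 L(s)
\;\Longrightarrow\;
L(s) = \frac{1}{1 + s^2} \qquad (s > 0).
```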

## Day 8. Proof of the L^1 Maximal Inequality

The warm-up problems will deal with the Laplace transform and the MGF. This is both to set up the work with transforms in Chapter 3 and to remind everyone about some basic calculus (e.g. the Gamma function, differentiation under the integral etc.)

The main task is to use first-step analysis to prove the maximal inequality that was introduced earlier. Naturally, we will follow the teaching note which I hope you have previewed.

As time permits we'll revisit the Paley-Zygmund argument and look at some of its variations. I have written some of this out in a note on lower bounds. We may not cover all of this, and we may revisit it from time to time.

## Day 7. An L^1 Maximal Inequality and a Proof of the SLLN

We'll have a couple of warm-up problems that use the moment generating function. One of these gives us a version of Hoeffding's Lemma. This leads in turn to a powerful concentration inequality that makes the SLLN for bounded random variables "trivial".
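For reference, here are the standard statements, written for bounded summands S_n = X_1 + ... + X_n with a_k <= X_k <= b_k:

```latex
% Hoeffding's Lemma: if E[X] = 0 and a \le X \le b, then for all \lambda,
E\bigl[ e^{\lambda X} \bigr]
  \;\le\; \exp\!\left( \frac{\lambda^2 (b - a)^2}{8} \right),
% and optimizing the Markov/Chernoff bound gives Hoeffding's inequality:
P\bigl( |S_n - E[S_n]| \ge t \bigr)
  \;\le\; 2 \exp\!\left( - \frac{2 t^2}{\sum_{k=1}^{n} (b_k - a_k)^2} \right).
```

With t = epsilon n the right side is summable in n, so Borel-Cantelli makes the SLLN for bounded random variables nearly immediate.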

We introduced another maximal inequality, the L^1 maximal inequality. This has notable benefits over Kolmogorov's L^2 maximal inequality and it gives a very direct proof of the SLLN. We did not prove the L^1 maximal inequality but we'll get it next time.

We then looked at some applications of the technology of the **weak law of large numbers**. We gave Bernstein's proof of Weierstrass approximation and we started to look at the proof that the Laplace transform determines the function that was transformed. We did not complete the argument but we reviewed some of the needed facts about the gamma family of densities.

HW 5 will be due on Monday, September 29.

## HW 4 Note --- A Complete Graph of Inferences

In problem 4 of HW4, you will want to avoid use of the DCT if you want the benefit of a new proof of the DCT in problem 5. Here you can use the MCT, but perhaps you can even avoid the use of the MCT. Part of the point here is that you can essentially form a complete graph of the implications between the MCT, DCT, and Fatou. Any given book chooses an order, and starting with the MCT is most natural, but all 6 permutations can be executed.

## Easiest Proof of SLLN?

We will shortly go over what I believe to be the easiest and most direct proof of the SLLN. If you want to read this ahead of time, please check out the teaching note. Also, if you have any feedback about this note, it would be useful to me. I'd appreciate anything from typos to "here is where I get stuck".

## Day 6. Adult Strength SLLN

We begin with a NO NAME quiz. You will be asked to write down the answers to three questions that I will write on the blackboard. Two of the questions will ask you to give full and correct statements of some lemmas, theorems, or facts that we have developed in class. One question is about calculus.

We'll go from the quiz to two warm-up problems that should put firmly in mind a couple of ideas that we need later in the class. We then prove the SLLN under the assumption of IID random variables and *just a finite first moment.*

This takes a sustained argument with three slices:

- **A DCT slice** --- pretty trivial but requiring some knowledge.
- **A BC lemma slice** --- again easy, but requiring a knowledge of a calculation and a lemma.
- **A Kolmogorov One Series slice** --- here we use v1.1, and we have to do a nice calculation to make it tick. The warm-up problems help us here.

This argument deserves to be mastered, and we'll take our time with it. If we do have time to spare, we'll revisit the proof of Levy's series theorem which we only sketched last time. We may also look at the highly flexible Paley-Zygmund argument.

## Day 5. Series Theorems of Kolmogorov and Levy

We begin with a warm-up problem or two: The proof of Jensen's inequality is one of these.

We then put the Cauchy criterion into the language of random variables. In particular, we put the difference between convergence in probability and convergence with probability one into a tidy analytical box. Everyone needs to sort out the ways to express what it means for a sequence to be almost surely Cauchy, and everyone needs to see how this differs from the *definition* of a sequence being Cauchy in probability.

We use Kolmogorov's maximal inequality to prove Kolmogorov's One Series Theorem v1.0 and we note that the same argument gives us v1.1. We'll note how the Kronecker lemma then gives us the SLLN for IID random variables with a finite variance.

We'll then prove a curiously general maximal inequality due to Paul Levy, and we'll use this inequality to prove Levy's Series Theorem which says that for series of independent summands convergence in probability and convergence with probability one are equivalent.

Homework 4 is due Monday September 22.

Note: Some people did not do well on HW2 because they did not have deep enough experience with delta-epsilon proofs. This is not a deficit that one can make up with a little extra work; most people need a solid one-year course in analysis to succeed in 530. The demands on your analysis skill will start piling up very substantially, so, if you are not confident in your analysis skill, you should consider switching to auditor status.

## 9/11 Coaching Added

I added a tiny bit of Cauchy coaching to the end of the HW3 assignment. It may keep you from getting confused in one of the problems.

## Day 4. More Techniques for the SLLN

We begin with three warm-up problems. They are motivated by what they suggest about problem solving. They also give us some facts that we'll need later.

We then give a proof of the SLLN for i.i.d. random variables with a finite variance. This is a "two trick" proof: (a) passing to a subsequence and (b) using monotone interpolation to solve the original problem. We'll see many proofs of this theorem. This version is one of the simplest and most direct.
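A quick numerical sanity check (not part of the lecture --- just a sketch using Uniform(0,1) draws, which certainly have a finite variance):

```python
import random

def running_means(n, seed=2014):
    """Sample means S_k / k for k = 1..n, from i.i.d. Uniform(0,1) draws."""
    rng = random.Random(seed)
    s, means = 0.0, []
    for k in range(1, n + 1):
        s += rng.random()
        means.append(s / k)
    return means

# The SLLN says the sample means should settle near E[X] = 1/2.
means = running_means(100_000)
print(means[-1])
```

You can also peek at the subsequence $k = m^2$ and watch how tame it is compared to the full sequence; that is the first "trick" in action.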

The task then becomes the proof of the SLLN under the most natural conditions where we only assume we have a finite first moment. We'll approach this by first considering infinite sums --- of real numbers and of random variables. This will lead us to the consideration of our first maximal inequality, Kolmogorov's "Weak Type L^2" maximal inequality.
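For reference, the inequality in question (as I expect we will use it) says that for independent, mean-zero $X_1, \dots, X_n$ with finite variances and partial sums $S_k = X_1 + \cdots + X_k$:

```latex
P\Bigl(\max_{1 \le k \le n} |S_k| \ge \lambda\Bigr)
\le \frac{\operatorname{Var}(S_n)}{\lambda^2}
= \frac{1}{\lambda^2} \sum_{k=1}^{n} \operatorname{Var}(X_k),
\qquad \lambda > 0.
```

The point is that the maximum of the partial sums obeys the same bound that Chebyshev gives for the last partial sum alone.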

You may need to review the Cauchy criterion. The venerable Wikipedia is a little lame this time but there is a useful discussion posted by an Oxford professor. It is not sophisticated but it is worth reading.

## Day 3. First Look at Limit Laws

We'll revisit BCII and give a generalization. The proof illustrates two basic "tricks": The benefit of working with non-negative random variables and the juice one can get by "passing to subsequences." We'll see other versions of BCII as the course progresses.
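The baseline version of BCII, for quick reference:

```latex
\text{If } A_1, A_2, \dots \text{ are independent and }
\sum_{n=1}^{\infty} P(A_n) = \infty,
\text{ then } P(A_n \ \text{i.o.}) = 1.
```

The generalizations we will meet relax the independence assumption; the divergence of the sum is what does the work.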

We'll then prove a couple of easy versions of the Strong Law of Large Numbers. These results are not critical in and of themselves, but the techniques are very important. You can make a modest living with just the techniques that are covered today.

We may also prove a version of the Kolmogorov maximal inequality. The theory and applications of maximal inequalities are one of the big divides between elementary probability theory and graduate probability theory. Maximal inequalities are not particularly hard, but they create a shift in the sophistication of the conversation.

Homework 3 is due on Monday, September 15.

## Comment on Homework

There were too many people who did not do the first homework. This could have happened for a variety of reasons, but it should not happen in the future. Please read the policies on homework. I can guarantee you that you will learn more by doing the homework than from any other part of the course.

Please also keep up with the blog. In particular, you should always be on the lookout for bug reports on the current homework.

## Fatou's Lemma

The Wikipedia article on Fatou's Lemma is surprisingly good. In addition to the usual proof it gives a "direct proof" (without the MCT). It also gives a version with "changing measures" that was new to me. It looks handy. Finally, it discusses the "conditional version" which we will need toward the end of the semester. Some attention is needed to the difference between the (easier) probability spaces and the (only slightly harder) spaces that do not have total mass equal to 1.
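For quick reference, the baseline statement in probabilistic form (nonnegative random variables):

```latex
E\Bigl[\liminf_{n \to \infty} X_n\Bigr]
\le \liminf_{n \to \infty} E[X_n],
\qquad X_n \ge 0.
```

The "changing measures" and conditional versions mentioned above dress up this same one-sided bound.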

## Homework Comments 9/4 and 9/7

I corrected a bug in the last problem on Homework No. 2 and a bug in the hint. Please look at the corrected version. BTW, this is a reminder that when you see what looks like a bug on a problem you should (a) check the web page for a correction and (b) if there is no posted correction, then send me email so that I can sort things out.

## Day 2. MCT, DCT, Fatou, Etc.

We look at the fundamental results of integration theory from the probabilist's point of view. After asking what one wants from an "expected value", we look at Lebesgue's answer --- and see why it is surprising. We'll then do "problem solving" to discover a proof of the monotone convergence theorem --- after first finding the 'baby' MCT. With the MCT in hand, Fatou's Lemma is easy. With Fatou in hand, the Dominated Convergence Theorem is easy.
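For reference, the full-strength MCT in its probabilistic form:

```latex
\text{If } 0 \le X_n \uparrow X \text{ a.s., then } E[X_n] \uparrow E[X].
```

The "baby" MCT mentioned above is the special case one discovers first; the problem-solving step is seeing how to remove its extra hypotheses.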

We'll then look at how one can estimate some expectations and look at three very fundamental inequalities. As time permits, we'll revisit the second Borel Cantelli lemma, perhaps giving two proofs.

Homework No. 2 will be due on Monday September 8 at class time. This will put us into a regular schedule of new homework posted each Monday and due the following Monday.

#### Office Hours: I will have office hours on Monday and Wednesday, 3pm to 4pm. Our course TA Peichao Peng will have office hours Tuesday and Thursday 4:30-5:30.

## Midterm Exam: Monday, November 10 (Class Time)

This is super-advanced notice of our midterm exam. I will give more details about the exam as the time gets closer, but the short version of the plan is that it should be about **factual knowledge** not about cleverness or problem solving. Those skills are left for the homeworks and the take home final. Since the midterm exam is about knowledge, no notes are permitted. The exam will count, but it will not count too heavily. It will have a 10% weight in the total grade. Full Disclosure: I will be at the INFORMS annual meeting for most of the week of November 10. There will be no class on November 12.

## Day 1. Getting Right to Work

We will go over the plan for the course and then get right to work. The main idea is independence, and some simple questions lead to the need for some new tools. We'll also meet two of our most constant companions: the two Borel Cantelli lemmas. We'll give proofs of these, and then start looking at applications. In the course of events, you will be reminded of various ideas from real analysis, especially limsup and liminf.
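A quick numerical illustration (not from the lecture) of the first Borel Cantelli lemma, using the hypothetical choice $P(A_n) = 1/n^2$, whose sum converges:

```python
import random

def occurrences(n_max, seed=530):
    """Simulate independent events A_n with P(A_n) = 1/n**2 and return
    the indices n at which A_n occurs."""
    rng = random.Random(seed)
    return [n for n in range(1, n_max + 1) if rng.random() < 1.0 / n**2]

# Since sum 1/n^2 < infinity, BCI says only finitely many A_n occur a.s.
# Even over a million trials, the list of hits should be tiny.
hits = occurrences(1_000_000)
print(hits)
```

The expected number of occurrences is $\sum_{n \ge 1} 1/n^2 = \pi^2/6 \approx 1.64$, so a run of a million trials typically produces only a handful of hits.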

Homework No. 1 is due on Wednesday September 3.

This website is the place where one checks in to find the current homework and all of the additional information about our course, including periodic postings of supplemental material.

You can look at the course syllabus for general information about the course as well as information about grading, homework, the midterm, and the final exam.

Please do review the course policies. I count on participants to read and follow these policies. They are quite reasonable, and it is awkward to have to single out an individual for not following our few rules.

Feel free to contact me if you have questions about the suitability of the course for you. In general the course will only be appropriate if you have had a **solid background in real analysis**, preferably at the graduate level.