## Course Blog Fall 2014

## Day 7. An L^1 Maximal Inequality and a Proof of the SLLN

We'll have a couple of warm-up problems that use the moment generating function. One of these gives us a version of Hoeffding's Lemma. This leads in turn to a powerful concentration inequality that makes the SLLN for bounded random variables "trivial".
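For those who like to see the statements ahead of time, here is a standard form of the lemma and the concentration inequality in question (the exact constants and normalizations used in class may differ):

```latex
% Hoeffding's Lemma (standard form): if a <= X <= b almost surely, then for all real t,
\[
\mathbb{E}\, e^{t(X-\mathbb{E}X)} \;\le\; \exp\!\left(\frac{t^2 (b-a)^2}{8}\right).
\]
% Chernoff's method then gives Hoeffding's inequality for S_n = X_1 + ... + X_n,
% with X_i i.i.d., bounded in [a, b], and mean mu:
\[
\mathbb{P}\!\left(\left|\frac{S_n}{n} - \mu\right| \ge \epsilon\right)
\;\le\; 2\exp\!\left(-\frac{2 n \epsilon^2}{(b-a)^2}\right).
\]
% These tail bounds are summable in n, so Borel--Cantelli hands us the SLLN
% for bounded random variables with almost no further work.
```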

We'll then prove another maximal inequality, the L^1 maximal inequality. This has notable benefits over Kolmogorov's L^2 maximal inequality. This will be based on the teaching note that I have been advertising for a while now.

As time permits we'll look at further implications of the Paley-Zygmund argument, or we'll look at some applications of the **weak law of large numbers** (e.g. Weierstrass approximation, Laplace uniqueness, and the unit cube as a torus).
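The WLLN proof of the Weierstrass approximation theorem runs through Bernstein polynomials: the degree-n Bernstein polynomial of f at x is just E[f(S_n/n)] for a Binomial(n, x) count S_n, and the WLLN says S_n/n concentrates near x. A minimal numerical sketch (the test function and degree below are arbitrary choices for illustration):

```python
from math import comb

def bernstein(f, n, x):
    """Degree-n Bernstein polynomial of f at x: E[f(S_n/n)] with S_n ~ Binomial(n, x)."""
    return sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1))

# Since S_n/n concentrates near x, B_n f -> f uniformly on [0, 1] for continuous f.
f = lambda x: x * (1 - x)          # an arbitrary continuous test function
approx = bernstein(f, 200, 0.3)
print(abs(approx - f(0.3)) < 0.01)  # prints True: degree 200 is already close at x = 0.3
```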

When you get a moment you should review the gamma family of densities. We'll need some basic facts about these shortly --- at a minimum we need the definition of the Gamma function. This family provides instructive examples of how one can use the MGF and the Laplace transform.
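As a small self-check while reviewing, one can verify the closed-form MGF of the gamma family numerically. For X with shape a and rate lam, the MGF is (lam/(lam - t))^a for t < lam; the quick Riemann-sum sketch below (the particular a, lam, t, cutoff, and step count are arbitrary choices) confirms this at one point:

```python
from math import gamma, exp

def gamma_mgf_numeric(a, lam, t, upper=100.0, steps=200_000):
    """Riemann-sum approximation of E[e^{tX}] for X ~ Gamma(shape a, rate lam)."""
    h = upper / steps
    total = 0.0
    for i in range(1, steps + 1):
        x = i * h  # right-endpoint rule on (0, upper]
        total += exp(t * x) * lam**a * x**(a - 1) * exp(-lam * x) / gamma(a) * h
    return total

a, lam, t = 2.0, 1.0, 0.5            # arbitrary shape/rate and a point with t < lam
closed_form = (lam / (lam - t))**a   # the Gamma MGF: (lam / (lam - t))^a for t < lam
print(abs(gamma_mgf_numeric(a, lam, t) - closed_form) < 1e-3)  # prints True
```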

HW 5 will be due on Monday, September 29.

## HW 4 Note --- A Complete Graph of Inferences

In problem 4 of HW4, you will want to avoid use of the DCT if you want the benefit of a new proof of the DCT in problem 5. Here you can use the MCT, but perhaps you can even avoid the use of the MCT. Part of the point here is that you can essentially form a complete graph of the implications between the MCT, DCT, and Fatou. Any given book chooses an order, and starting with the MCT is most natural, but all six permutations can be executed.

## Easiest Proof of SLLN?

We will shortly go over what I believe to be the easiest and most direct proof of the SLLN. If you want to read this ahead of time, please check out the teaching note. Also, if you have any feedback about this note, it would be useful to me. I'd appreciate anything from typos to "here is where I get stuck".

## Day 6. Adult Strength SLLN

We begin with a NO NAME quiz. You will be asked to write down the answers to three questions that I will write on the blackboard. Two of the questions will ask you to give full and correct statements of some lemmas, theorems, or facts that we have developed in class. One question is about calculus.

We'll go from the quiz to two warm-up problems that should put firmly in mind a couple of ideas that we need later in the class. We then prove the SLLN under the assumption of IID random variables and *just a finite first moment.*

This takes a sustained argument with three slices:

- **A DCT slice** --- pretty trivial but requiring some knowledge.
- **A BC lemma slice** --- again easy, but requiring knowledge of a calculation and a lemma.
- **A Kolmogorov One Series slice** --- here we use V1.1, and we have to do a nice calculation to make it tick. The warm-up problems help us here.

This argument deserves to be mastered, and we'll take our time with it. If we do have time to spare, we'll revisit the proof of Levy's series theorem which we only sketched last time. We may also look at the highly flexible Paley-Zygmund argument.

## Day 5. Series Theorems of Kolmogorov and Levy

We begin with a warm-up problem or two: The proof of Jensen's inequality is one of these.
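For those rehearsing Jensen's inequality beforehand, the finite-distribution case E[phi(X)] >= phi(E[X]) for convex phi can be checked in a few lines (the values, probabilities, and choice of phi below are arbitrary illustration data):

```python
# Jensen's inequality for a finite distribution: E[phi(X)] >= phi(E[X]) for convex phi.
values = [1.0, 2.0, 5.0]   # arbitrary support points
probs  = [0.5, 0.3, 0.2]   # arbitrary probabilities (sum to 1)

phi = lambda x: x * x      # a convex function

mean     = sum(p * v for p, v in zip(probs, values))          # E[X]   = 2.1
mean_phi = sum(p * phi(v) for p, v in zip(probs, values))     # E[X^2] = 6.7
print(mean_phi >= phi(mean))  # prints True: 6.7 >= 4.41
```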

We then put the Cauchy criterion into the language of random variables. In particular, we put the difference between convergence in probability and convergence with probability one into a tidy analytical box. Everyone needs to sort out the ways to express what it means for a sequence to be almost surely Cauchy, and everyone needs to see how this differs from the *definition* of a sequence being Cauchy in probability.
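One standard way to put the two notions side by side, stated here for convenience:

```latex
% Almost surely Cauchy: for every eps > 0 the uniform tail event shrinks away,
\[
\lim_{N\to\infty}\,
\mathbb{P}\!\left(\sup_{m,n\ge N} |X_m - X_n| > \epsilon\right) = 0
\quad\text{for every } \epsilon > 0,
\]
% while Cauchy in probability controls only one pair (m, n) at a time:
\[
\lim_{m,n\to\infty}\,
\mathbb{P}\!\left(|X_m - X_n| > \epsilon\right) = 0
\quad\text{for every } \epsilon > 0.
\]
% The sup inside the probability is exactly what maximal inequalities are built to handle.
```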

We use Kolmogorov's maximal inequality to prove Kolmogorov's One Series Theorem v1.0 and we note that the same argument gives us v1.1. We'll note how the Kronecker lemma then gives us the SLLN for IID random variables with a finite variance.

We'll then prove a curiously general maximal inequality due to Paul Levy, and we'll use this inequality to prove Levy's Series Theorem, which says that for series of independent summands, convergence in probability and convergence with probability one are equivalent.

Homework 4 is due Monday September 22.

Note: Some people did not do well on HW2 because they did not have deep enough experience with delta-epsilon proofs. This is not a deficit that one can make up with a little extra work; most people need a solid one-year course in analysis to succeed in 530. The demands on your analysis skill will start piling up very substantially, so, if you are not confident in your analysis skill, you should consider switching to auditor status.

## 9/11 Coaching Added

I added a tiny bit of Cauchy coaching to the end of the HW3 assignment. It may keep you from getting confused in one of the problems.

## Day 4. More Techniques for the SLLN

We begin with three warm-up problems. They are motivated by what they suggest about problem solving. They also give us some facts that we'll need later.

We then give a proof of the SLLN for i.i.d. random variables with a finite variance. This is a "two trick" proof: (a) passing to a subsequence (b) using monotone interpolation to solve the original problem. We'll see many proofs of this theorem. This version is one of the simplest and most direct.
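The theorem itself is easy to see in action before we prove it. A quick sanity simulation (the seed, sample size, and coin-flip distribution are arbitrary choices made for reproducibility):

```python
import random

# SLLN sanity check for i.i.d. variables with finite variance: running averages
# of fair 0/1 coin flips should settle near 1/2.  With n = 100,000 flips the
# standard deviation of the average is about 0.0016, so a 0.02 tolerance is safe.
rng = random.Random(12345)
n = 100_000
total = sum(rng.random() < 0.5 for _ in range(n))
print(abs(total / n - 0.5) < 0.02)
```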

The task then becomes the proof of the SLLN under the most natural conditions where we only assume we have a finite first moment. We'll approach this by first considering infinite sums --- of real numbers and of random variables. This will lead us to the consideration of our first maximal inequality, Kolmogorov's "Weak Type L^2" maximal inequality.

You may need to review the Cauchy criterion. The venerable Wikipedia is a little lame this time but there is a useful discussion posted by an Oxford professor. It is not sophisticated but it is worth reading.

## Day 3. First Look at Limit Laws

We'll revisit BCII and give a generalization. The proof illustrates two basic "tricks": The benefit of working with non-negative random variables and the juice one can get by "passing to subsequences." We'll see other versions of BCII as the course progresses.

We'll then prove a couple of easy versions of the Strong Law of Large Numbers. These results are not critical in and of themselves, but the techniques are very important. You can make a modest living with just the techniques that are covered today.

We may also prove a version of the Kolmogorov maximal inequality. The theory and applications of maximal inequalities form one of the big divides between elementary probability theory and graduate probability theory. Maximal inequalities are not particularly hard, but they create a shift in the sophistication of the conversation.

Homework 3 is due on Monday, September 15.

## Comment on Homework

There were too many people who did not do the first homework. This could have happened for a variety of reasons, but it should not happen in the future. Please read the policies on homework. I can guarantee you that you will learn more by doing the homework than from any other part of the course.

Please also keep up with the blog. In particular, you should always be on the lookout for bug reports on the current homework.

## Fatou's Lemma

The Wikipedia article on Fatou's Lemma is surprisingly good. In addition to the usual proof it gives a "direct proof" (without the MCT). It also gives a version with "changing measures" that was new to me. It looks handy. Finally, it discusses the "conditional version" which we will need toward the end of the semester. Some attention is needed to the difference between the (easier) probability spaces and the (only slightly harder) spaces that do not have total mass equal to 1.

## Homework Comments 9/4 and 9/7

I corrected a bug in the last problem on Homework No. 2 and a bug in the hint. Please look at the corrected version. BTW, this is a reminder that when you see what looks like a bug on a problem you should (a) check the web page for a correction and (b) if there is no posted correction, then send me email so that I can sort things out.

## Day 2. MCT, DCT, Fatou, Etc.

We look at the fundamental results of integration theory from the probabilist's point of view. After asking what one wants from an "expected value", we look at Lebesgue's answer --- and see why it is surprising. We'll then do "problem solving" to discover a proof of the monotone convergence theorem --- after first finding the 'baby' MCT. With the MCT in hand, Fatou's Lemma is easy. With Fatou in hand, the Dominated Convergence Theorem is easy.
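As a reminder of how the chain fits together, here is the one-line derivation of Fatou from the MCT for nonnegative random variables:

```latex
% Fatou from the MCT: for X_n >= 0, set Y_n = inf_{k >= n} X_k.
% Then 0 <= Y_n increases to liminf X_n, so the MCT applies to (Y_n):
\[
\mathbb{E}\!\left[\liminf_{n\to\infty} X_n\right]
\;=\; \lim_{n\to\infty} \mathbb{E}[Y_n]
\;\le\; \liminf_{n\to\infty} \mathbb{E}[X_n],
\]
% where the inequality holds because Y_n <= X_k for every k >= n,
% hence E[Y_n] <= inf_{k >= n} E[X_k].
```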

We'll then look at how one can estimate some expectations and look at three very fundamental inequalities. As time permits, we'll revisit the second Borel Cantelli lemma, perhaps giving two proofs.

Homework No. 2 will be due on Monday September 8 at class time. This will put us into a regular schedule of new homework posted each Monday and due the following Monday.

#### Office Hours: I will have office hours on Monday and Wednesday, 3pm to 4pm. Our course TA Peichao Peng will have office hours Tuesday and Thursday 4:30-5:30.

## Midterm Exam: Monday, November 10 (Class Time)

This is super-advanced notice of our midterm exam. I will give more details about the exam as the time gets closer, but the short version of the plan is that it should be about **factual knowledge** not about cleverness or problem solving. Those skills are left for the homeworks and the take home final. Since the midterm exam is about knowledge, no notes are permitted. The exam will count, but it will not count too heavily. It will have a 10% weight in the total grade. Full disclosure: I will be at the INFORMS annual meeting for most of the week of November 10. There will be no class on November 12.

## Day 1. Getting Right to Work

We will go over the plan for the course and then get right to work. The main idea is independence, and some simple questions lead to the need for some new tools. We'll also meet two of our most constant companions: the two Borel Cantelli lemmas. We'll give proofs of these, and then start looking at applications. In the course of events, you will be reminded of various ideas from real analysis, especially limsup and liminf.

Homework No. 1 is due on Wednesday September 3.

This website is the place where one checks in to find the current homework and all of the additional information about our course, including periodic postings of supplemental material.

You can look at the course syllabus for general information about the course as well as information about grading, homework, the midterm, and the final exam.

Please do review the course policies. I count on participants to read and follow these policies. They are quite reasonable, and it is awkward to have to single out an individual for not following our few rules.

Feel free to contact me if you have questions about the suitability of the course for you. In general the course will only be appropriate if you have had a **solid background in real analysis**, preferably at the graduate level.