Course Blog Fall 2014
Heads Up --- NNQ
There will be a No-Name-Quiz on Wednesday. Please review the most important results that we have covered in the last two weeks, and get them into short-term memory. They can never be part of long-term memory if they never make it to short-term memory.
In other business, there are two more proofs of the L^1 maximal inequality, based on two lemmas of independent interest: the "leader lemma" and the "rising sun lemma". We are unlikely to deal with these, but they are cool, and if we get some spare time I'll explain why they interest me.
Day 9. Characteristic Functions and Dirichlet's Integral
For the warm-up problems, we'll look at examples of characteristic functions and Laplace transforms. We then look at some general properties of characteristic functions, including the uniform continuity property and the positive definiteness property.
Next we look at the integral of sin x over x. We will cherry pick from the scholium on sin(x)/x but there will be lots left for you to read. An important goal will be to give a clear and complete discussion of Dirichlet's Integral. You will want to digest this pretty quickly; it will be needed next class, and you don't want it to be a mystery.
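As a purely numerical sanity check (it proves nothing, but it is reassuring before you dig into the scholium), one can approximate the Dirichlet integral on a long finite interval. Here is a short Python sketch, just for illustration:

```python
# Numerical sanity check (not a proof) that the Dirichlet integral
#   int_0^infty sin(x)/x dx
# equals pi/2.  We integrate sin(x)/x over [0, N] with the trapezoid
# rule; the neglected tail beyond N oscillates with size O(1/N).
import math

def dirichlet_partial(N, steps):
    """Trapezoid-rule value of the integral of sin(x)/x over [0, N]."""
    h = N / steps
    # endpoint values: sin(x)/x -> 1 as x -> 0, and sin(N)/N at x = N
    total = 0.5 * (1.0 + math.sin(N) / N)
    for k in range(1, steps):
        x = k * h
        total += math.sin(x) / x
    return total * h

approx = dirichlet_partial(1000.0, 1_000_000)
print(approx, math.pi / 2)
```

The agreement to three decimal places is limited by the slowly decaying oscillatory tail, which is exactly why the honest analysis of this integral takes some care.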
As time permits, we'll discuss other properties of characteristic functions and look at some big picture issues. We'll be doing classical mathematics for a few days, and you want to make sure you can see the forest for the trees --- however lovely the trees may be.
Homework 6 is due on Monday October 6. If you have not been using LaTeX yet, it is time to bite the bullet. From this point onward, LaTeX is required. You can save a little typing if you start with my LaTeX source for the problems. Be sure to read the text through section 3.3. Not everything can be covered in class.
We're well into the homework process now, but it seems that some reminders are needed about HW policies. You are strongly encouraged to do the HWs individually. At least 80% should be done individually. If you collaborate with a fellow student on a problem or if you get a hint from a friend, you should acknowledge this by a written statement on your HW.
Scholium on sin(x)/x
We will shortly need some information about the integral of sin(x)/x. We won't need a ton of information, but this topic offers a beautiful window on classical mathematics. You should take a peek if you have the time. I've written a brief and leisurely scholium on sin(x)/x, which has more than we need --- but you may still find it useful (even entertaining). Among other things, it gets the Laplace transform of sin(x) by five different methods.
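If you want a quick check on any of those five computations, the transform itself is easy to verify numerically: the Laplace transform of sin(x) is 1/(1+s^2). The Python sketch below (illustrative only) approximates the integral of e^{-sx} sin(x) over a long interval and compares:

```python
# Numerical check that the Laplace transform of sin(x),
#   int_0^infty exp(-s*x) * sin(x) dx,
# equals 1/(1 + s^2).  The exponential damping makes truncation at
# a modest T essentially exact.
import math

def laplace_sin(s, T=40.0, steps=400_000):
    """Trapezoid-rule approximation of int_0^T exp(-s*x)*sin(x) dx."""
    h = T / steps
    total = 0.5 * (0.0 + math.exp(-s * T) * math.sin(T))
    for k in range(1, steps):
        x = k * h
        total += math.exp(-s * x) * math.sin(x)
    return total * h

for s in (0.5, 1.0, 2.0):
    print(s, laplace_sin(s), 1.0 / (1.0 + s * s))
```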
Day 8. Proof of the L^1 Maximal Inequality
The warm-up problems will deal with the Laplace transform and the MGF. This is both to set up the work with transforms in Chapter 3 and to remind everyone about some basic calculus (e.g. the Gamma function, differentiation under the integral sign, etc.).
The main task is to use first-step analysis to prove the maximal inequality that was introduced earlier. Naturally, we will follow the teaching note which I hope you have previewed.
As time permits we'll revisit the Paley-Zygmund argument and look at some of its variations. I have written some of this out in a note on lower bounds. We may not cover all of this, and we may revisit it from time to time.
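As a quick illustration (and no substitute for the note on lower bounds), one can check the Paley-Zygmund bound P(Z >= theta*E[Z]) >= (1 - theta)^2 (E[Z])^2 / E[Z^2] for a nonnegative Z by simulation. The Python sketch below uses Exponential(1) variables, chosen just for convenience:

```python
# Empirical check of the Paley-Zygmund inequality for Z ~ Exponential(1):
#   P(Z >= theta * E[Z]) >= (1 - theta)^2 * (E[Z])^2 / E[Z^2].
import random

random.seed(7)

def paley_zygmund_check(theta, n=100_000):
    """Return (empirical tail probability, Paley-Zygmund lower bound)
    using n simulated Exponential(1) variables."""
    zs = [random.expovariate(1.0) for _ in range(n)]
    m1 = sum(zs) / n                      # sample E[Z]
    m2 = sum(z * z for z in zs) / n       # sample E[Z^2]
    tail = sum(1 for z in zs if z >= theta * m1) / n
    bound = (1 - theta) ** 2 * m1 * m1 / m2
    return tail, bound

tail, bound = paley_zygmund_check(0.5)
print(tail, bound)
```

For the exponential the bound is far from sharp, which is typical: Paley-Zygmund trades sharpness for remarkable generality.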
Day 7. An L^1 Maximal Inequality and a Proof of the SLLN
We'll have a couple of warm-up problems that use the moment generating function. One of these gives us a version of Hoeffding's Lemma. This leads in turn to a powerful concentration inequality that makes the SLLN for bounded random variables "trivial".
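To see the concentration inequality in action, here is a quick simulation (Python, illustrative only) comparing the empirical deviation probability for averages of [0,1]-valued variables with the Hoeffding bound 2*exp(-2*n*t^2):

```python
# Empirical check of Hoeffding's inequality for iid Bernoulli(1/2)
# variables (bounded in [0, 1]):
#   P(|mean_n - 1/2| >= t) <= 2 * exp(-2 * n * t^2).
import math
import random

random.seed(42)

def hoeffding_demo(n=200, t=0.1, trials=10_000):
    """Return (empirical deviation probability, Hoeffding bound)."""
    hits = 0
    for _ in range(trials):
        mean = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) >= t:
            hits += 1
    return hits / trials, 2.0 * math.exp(-2.0 * n * t * t)

emp, bound = hoeffding_demo()
print(emp, bound)
```

The exponential decay in n is the "power" referred to above: summing the bound over n gives a convergent series, and Borel-Cantelli then makes the SLLN for bounded variables nearly immediate.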
We introduced another maximal inequality, the L^1 maximal inequality. This has notable benefits over Kolmogorov's L^2 maximal inequality and it gives a very direct proof of the SLLN. We did not prove the L^1 maximal inequality but we'll get it next time.
We then looked at some applications of the technology of the weak law of large numbers. We gave Bernstein's proof of the Weierstrass approximation theorem, and we started to look at the proof that the Laplace transform determines the function that was transformed. We did not complete the argument, but we reviewed some of the needed facts about the gamma family of densities.
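Bernstein's proof is constructive, and it is easy to watch it work numerically. The sketch below (Python, just for illustration) builds the Bernstein polynomial of f(x) = |x - 1/2| and prints the sup-norm error on a grid; for this kink function the error shrinks roughly like 1/sqrt(n):

```python
# Bernstein polynomial approximation behind the probabilistic proof of
# the Weierstrass approximation theorem:
#   B_n(f)(x) = sum_k f(k/n) * C(n, k) * x^k * (1 - x)^(n - k),
# i.e. E[f(S_n / n)] with S_n ~ Binomial(n, x).
import math

def bernstein(f, n, x):
    """Value at x of the n-th Bernstein polynomial of f on [0, 1]."""
    return sum(
        f(k / n) * math.comb(n, k) * x ** k * (1 - x) ** (n - k)
        for k in range(n + 1)
    )

def f(x):
    """The kink function |x - 1/2|, a classic test of uniform approximation."""
    return abs(x - 0.5)

grid = [i / 200 for i in range(201)]
for n in (10, 40, 160):
    err = max(abs(bernstein(f, n, x) - f(x)) for x in grid)
    print(n, err)
```

The weak law is what makes this work: Binomial(n, x)/n concentrates at x, so the expectation E[f(S_n/n)] is forced toward f(x), uniformly in x.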
HW 5 will be due on Monday, September 29.
HW 4 Note --- A Complete Graph of Inferences
In problem 4 of HW4, you will want to avoid use of the DCT if you want the benefit of a new proof of the DCT in problem 5. Here you can use the MCT, but perhaps you can even avoid the use of the MCT. Part of the point here is that you can essentially form a complete graph of the implications between the MCT, DCT, and Fatou. Any given book chooses an order, and starting with the MCT is most natural, but all 6 permutations can be executed.
Easiest Proof of SLLN?
We will shortly go over what I believe to be the easiest and most direct proof of the SLLN. If you want to read this ahead of time, please check out the teaching note. Also, if you have any feedback about this note, it would be useful to me. I'd appreciate anything from typos to "here is where I get stuck".
Day 6. Adult Strength SLLN
We begin with a NO NAME quiz. You will be asked to write down the answers to three questions that I will write on the blackboard. Two of the questions will ask you to give full and correct statements of some lemmas, theorems, or facts that we have developed in class. One question is about calculus.
We'll go from the quiz to two warm-up problems that should put firmly in mind a couple of ideas we'll need later in the class. We then prove the SLLN under the assumption of IID random variables with just a finite first moment.
This takes a sustained argument with three slices:
- A DCT slice --- pretty trivial but requiring some knowledge.
- A BC lemma slice --- again easy, but requiring a knowledge of a calculation and a lemma.
- A Kolmogorov One Series slice --- here we use v1.1, and we have to do a nice calculation to make it tick. The warm-up problems help us here.
This argument deserves to be mastered, and we'll take our time with it. If we do have time to spare, we'll revisit the proof of Levy's series theorem which we only sketched last time. We may also look at the highly flexible Paley-Zygmund argument.
Day 5. Series Theorems of Kolmogorov and Levy
We begin with a warm-up problem or two: The proof of Jensen's inequality is one of these.
We then put the Cauchy criterion into the language of random variables. In particular, we put the difference between convergence in probability and convergence with probability one into a tidy analytical box. Everyone needs to sort out the ways to express what it means for a sequence to be almost surely Cauchy, and everyone needs to see how this differs from the definition of a sequence being Cauchy in probability.
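For reference, the two conditions can be put side by side. In LaTeX (as you might write them on your homework), a standard formulation is:

```latex
% Almost surely Cauchy: one event of probability one.
P\bigl( \forall \varepsilon > 0 \;\; \exists N \;\;
        \forall m, n \ge N : \; |X_m - X_n| < \varepsilon \bigr) = 1 .

% Cauchy in probability: a family of deviation probabilities, each
% going to zero; no single almost sure event is asserted.
\forall \varepsilon > 0 : \quad
P\bigl( |X_m - X_n| > \varepsilon \bigr) \longrightarrow 0
\quad \text{as } m, n \to \infty .
```

The quantifier placement is the whole story: in the first condition the probability is of one grand event, while in the second each epsilon gets its own limit statement.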
We use Kolmogorov's maximal inequality to prove Kolmogorov's One Series Theorem v1.0 and we note that the same argument gives us v1.1. We'll note how the Kronecker lemma then gives us the SLLN for IID random variables with a finite variance.
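For a numerical feel for the maximal inequality, the sketch below (Python, illustrative only) simulates a random walk with iid +/-1 steps and compares the empirical probability that max_{k <= n} |S_k| >= lambda with Kolmogorov's bound Var(S_n)/lambda^2:

```python
# Empirical check of Kolmogorov's maximal inequality for a +/-1
# random walk S_k:
#   P(max_{k <= n} |S_k| >= lam) <= Var(S_n) / lam^2 = n / lam^2.
import random

random.seed(1)

def kolmogorov_check(n=100, lam=20.0, trials=10_000):
    """Return (empirical probability of the running max exceeding lam,
    Kolmogorov's bound n / lam^2)."""
    hits = 0
    for _ in range(trials):
        s, peak = 0, 0
        for _ in range(n):
            s += 1 if random.random() < 0.5 else -1
            peak = max(peak, abs(s))
        if peak >= lam:
            hits += 1
    return hits / trials, n / lam ** 2

emp, bound = kolmogorov_check()
print(emp, bound)
```

The striking feature is that the bound controls the maximum of all n partial sums at the price of only the final variance; Chebyshev alone would only control the last term.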
We'll then prove a curiously general maximal inequality due to Paul Levy, and we'll use this inequality to prove Levy's Series Theorem which says that for series of independent summands convergence in probability and convergence with probability one are equivalent.
Homework 4 is due Monday September 22.
Note: Some people did not do well on HW2 because they did not have deep enough experience with delta-epsilon proofs. This is not a deficit that one can make up with a little extra work; most people need a solid one-year course in analysis to succeed in 530. The demands on your analysis skill will start piling up very substantially, so, if you are not confident in your analysis skill, you should consider switching to auditor status.
9/11 Coaching Added
I added a tiny bit of Cauchy coaching to the end of the HW3 assignment. It may keep you from getting confused in one of the problems.
Day 4. More Techniques for the SLLN
We begin with three warm-up problems. They are motivated by what they suggest about problem solving. They also give us some facts that we'll need later.
We then give a proof of the SLLN for i.i.d. random variables with a finite variance. This is a "two trick" proof: (a) passing to a subsequence, and (b) using monotone interpolation to solve the original problem. We'll see many proofs of this theorem. This version is one of the simplest and most direct.
The task then becomes the proof of the SLLN under the most natural conditions where we only assume we have a finite first moment. We'll approach this by first considering infinite sums --- of real numbers and of random variables. This will lead us to the consideration of our first maximal inequality, Kolmogorov's "Weak Type L^2" maximal inequality.
You may need to review the Cauchy criterion. The venerable Wikipedia is a little lame this time but there is a useful discussion posted by an Oxford professor. It is not sophisticated but it is worth reading.
Day 3. First Look at Limit Laws
We'll revisit BCII and give a generalization. The proof illustrates two basic "tricks": The benefit of working with non-negative random variables and the juice one can get by "passing to subsequences." We'll see other versions of BCII as the course progresses.
We'll then prove a couple of easy versions of the Strong Law of Large Numbers. These results are not critical in and of themselves, but the techniques are very important. You can make a modest living with just the techniques that are covered today.
We may also prove a version of the Kolmogorov maximal inequality. The theory and applications of maximal inequalities form one of the big divides between elementary probability theory and graduate probability theory. Maximal inequalities are not particularly hard, but they create a shift in the sophistication of the conversation.
Homework 3 is due on Monday, September 15.
Comment on Homework
There were too many people who did not do the first homework. This could have happened for a variety of reasons, but it should not happen in the future. Please read the policies on homework. I can guarantee you that you will learn more by doing the homework than from any other part of the course.
Please also keep up with the blog. In particular, you should always be on the lookout for bug reports on the current homework.
The Wikipedia article on Fatou's Lemma is surprisingly good. In addition to the usual proof it gives a "direct proof" (without the MCT). It also gives a version with "changing measures" that was new to me. It looks handy. Finally, it discusses the "conditional version" which we will need toward the end of the semester. Some attention is needed to the difference between the (easier) probability spaces and the (only slightly harder) spaces that do not have total mass equal to 1.
Homework Comments 9/4 and 9/7
I corrected a bug in the last problem on Homework No. 2 and a bug in the hint. Please look at the corrected version. BTW, this is a reminder that when you see what looks like a bug on a problem you should (a) check the web page for a correction and (b) if there is no posted correction, then send me email so that I can sort things out.
Day 2. MCT, DCT, Fatou, Etc.
We look at the fundamental results of integration theory from the probabilist's point of view. After asking what one wants from an "expected value", we look at Lebesgue's answer --- and see why it is surprising. We'll then do "problem solving" to discover a proof of the monotone convergence theorem --- after first finding the 'baby' MCT. With the MCT in hand, Fatou's Lemma is easy. With Fatou in hand, the Dominated Convergence Theorem is easy.
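The classic example showing that Fatou's inequality can be strict (and that the DCT really needs its dominating function) is the "moving bump" f_n = indicator of [n, n+1). A small Python sketch, just for illustration, makes the point numerically:

```python
# Moving bump example: each f_n integrates to 1, yet f_n(x) -> 0 for
# every fixed x.  So
#   integral(liminf f_n) = 0  <  1 = liminf integral(f_n),
# Fatou's inequality is strict, and no integrable function dominates
# the whole sequence.
def f(n, x):
    """Moving bump: the indicator of the interval [n, n+1)."""
    return 1.0 if n <= x < n + 1 else 0.0

def integral(g, a=0.0, b=50.0, steps=50_000):
    """Midpoint-rule integral of g over [a, b]."""
    h = (b - a) / steps
    return sum(g(a + (k + 0.5) * h) for k in range(steps)) * h

for n in (0, 5, 20):
    print(n, integral(lambda x: f(n, x)))   # each integral is 1
print(f(0, 7.3), f(5, 7.3), f(20, 7.3))     # at fixed x, the bump moves on
```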
We'll then look at how one can estimate some expectations and look at three very fundamental inequalities. As time permits, we'll revisit the second Borel Cantelli lemma, perhaps giving two proofs.
Homework No. 2 will be due on Monday September 8 at class time. This will put us into a regular schedule of new homework posted each Monday and due the following Monday.
Office Hours: I will have office hours on Monday and Wednesday, 3pm to 4pm. Our course TA Peichao Peng will have office hours Tuesday and Thursday 4:30-5:30.
Midterm Exam: Monday, November 10 (Class Time)
This is super-advanced notice of our midterm exam. I will give more details about the exam as the time gets closer, but the short version of the plan is that it should be about factual knowledge, not about cleverness or problem solving. Those skills are left for the homeworks and the take-home final. Since the midterm exam is about knowledge, no notes are permitted. The exam will count, but it will not count too heavily; it will have a 10% weight in the total grade. Full Disclosure: I will be at the INFORMS annual meeting for most of the week of November 10. There will be no class on November 12.
Day 1. Getting Right to Work
We will go over the plan for the course and then get right to work. The main idea is independence, and some simple questions lead to the need for some new tools. We'll also meet two of our most constant companions: the two Borel Cantelli lemmas. We'll give proofs of these, and then start looking at applications. In the course of events, you will be reminded of various ideas from real analysis, especially limsup and liminf.
Homework No. 1 is due on Wednesday September 3.
This website is the place where one checks in to find the current homework and all of the additional information about our course, including periodic postings of supplemental material.
You can look at the course syllabus for general information about the course as well as information about grading, homework, the midterm, and the final exam.
Please do review the course policies. I count on participants to read and follow these policies. They are quite reasonable, and it is awkward to have to single out an individual for not following our few rules.
Feel free to contact me if you have questions about the suitability of the course for you. In general the course will only be appropriate if you have had a solid background in real analysis, preferably at the graduate level.