Time Series Analysis


Lecture notes
These lecture notes vary in detail from class to class and are intended as an outline of what we will cover. The syllabus summarizes what's coming, along with the readings from the course text.

I try to post a preliminary version of the notes the evening before each class; the version that I will use appears sometime the morning of class. In addition, I will generally post an update to the notes after class to fix typos, add comments, and perhaps move things that we did not get to (most likely, to a later set of notes). I put a time stamp on the latest version.

  1. Introduction (R script file for this lecture)
  2. Stationarity (R script)
  3. Descriptive estimators (R code)
    Read Appendix A (in the text) to fill in details that come with this lecture. The topics continue into Lecture 4.
  4. Properties of descriptive estimators (R code)
  5. Regression models (R code)
  6. Harmonic regression (R code for the variable star data)
  7. Eigenvectors of stationary processes. These notes are a highly condensed version of this on-line manuscript.
  8. Autoregressive, moving average models (ARMA) (R code for simulating ARMA models)
  9. Covariances of ARMA models
  10. Predicting ARMA processes
  11. Central limit theorems
  12. Estimating ARMA processes (basic script and more R code for estimating ARMA models)
  13. Resampling ARMA processes (R code)
  14. State-space models
  15. Kalman filter (R code)
  16. Hilbert spaces
  17. Spectral representation
  18. Discrete Fourier transform (14 Apr 2011, edits, a figure) (R code)
  19. Spectral estimation (14 Apr 2011, edits, rearrange) (R code)

Assignments
In general, you ought to read, and sketch an answer to, all of the exercises at the end of each chapter. I will pick out a few that seem most relevant, but that does not mean you should ignore the others.

You will have about a week to submit what you've done. You need not use R for the computing, but you will need access to some sort of software because some questions call for doing a bit of computing.

Solutions will be posted or handed out when assignments are returned, and then removed from the web page.

I've used the numbering from the 2nd edition of the text for the exercises, and will note where the numbering differs in the 3rd edition.

  1. Due Feb 11. (comprises 2 assignments)
    Solution (R code)
    Chapter 1: Exercises 1.8-1.10, 1.14, 1.15, 1.18, 1.19, 1.20, 1.24 (we do most of (b) in class; you think about the rest), and, as best you can, 1.28, 1.30, and 1.31. For the latter problems (and parts of the notes for Lecture 3 as well), look at Appendix A.
    (Exercise 1.24 is 1.25 in the 3rd edition; 1.28 is 1.29; 1.30 is 1.31; 1.31 is 1.32.)

    Chapter 2: Exercises 2.1, 2.4, 2.5, 2.9, 2.10
    Several of these consist only of data analysis, so include graphs to show what you have done.
    (Exercise 2.9 is 2.11 in the 3rd edition, and 2.10 is 2.9 in the 3rd edition.)

    Nonlinear time series can exhibit behavior quite different from that of linear processes. Simulate the SETAR model
    y[t] = 0.4266 + w[t]                   if y[t-1] < 0.1
    y[t] = 2.0372 - 2.7399 y[t-1] + w[t]   otherwise,
    where w[t] is Gaussian white noise with variance 25. Let the process run for t = 1, ..., 100, then "cut off" the white noise (i.e., set w[t] = 0 for t = 101, 102, ..., 150); a simulation sketch follows below.
    What happens? How does the behavior of this process differ from that of a general linear process when the noise "cuts off"? (This model comes from an application to lemming counts in Norway.)
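    As a starting point, here is a minimal R sketch of this simulation. The model and the noise cut-off come from the problem statement; the seed, the starting value y[1] = 0, and the plot are my own choices.

      set.seed(1)                              # any seed; just for reproducibility
      n <- 150
      w <- c(rnorm(100, sd = 5), rep(0, 50))   # variance 25 => sd 5; noise cut off after t = 100
      y <- numeric(n)                          # starts the process at y[1] = 0
      for (t in 2:n) {
        if (y[t - 1] < 0.1) {
          y[t] <- 0.4266 + w[t]
        } else {
          y[t] <- 2.0372 - 2.7399 * y[t - 1] + w[t]
        }
      }
      plot(y, type = "l")                      # watch what happens after t = 100
      abline(v = 100, lty = 2)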

  2. Due March 4.
    Solution (R code)
    Chapter 3: Exercises 3.3, 3.7, 3.8, 3.9, 3.10, 3.11, 3.12, 3.17, 3.19, 3.31
    (3rd Edition: 3.4, 3.8, 3.9, 3.10, 3.11, 3.12, 3.13, 3.18, 3.20, 3.33)
    For 3.19 (3.20 in 3rd ed), you don't need to turn in the simulation; just explain the nature of the process that was generated and the estimates from the ARMA(1,1) model.
    For 3.31 (3.33 in 3rd), difference the temperature data and model the differences as an ARMA process. To estimate an ARMA model, use the R function "arima" (use ?arima to see the on-line help); a short sketch of this workflow appears at the end of this assignment.

    Add the following two parts to question 3.9 (3.10 in 3rd):
    (c) Are the coverage properties of the 4 prediction intervals independent? That is, are the four 0/1 random variables that indicate whether the intervals cover the future values independent? Explain your answer briefly.
    (d) The question asks for a 95% prediction interval at each lead. How can you get 95% coverage over all 4 weeks?

    Hint. Exercise 3.11 (3.12 in 3rd ed) is harder than its length suggests. Here's one approach. Suppose that Gamma_n is singular; then there is an AR(k) model (for some k less than n), say
    X_t = phi_1 X_{t-1} + ... + phi_k X_{t-k},
    that fits perfectly. The exercise does not state it, but the process is assumed stationary, so this recursion applies at every point in time. Show using backsubstitution that this model predicts X_n perfectly from X_1,...,X_k. Find the covariances implied by this relationship, a la the Yule-Walker equations (multiply by a lagged value and take expectations to get gamma(h) = phi_1 gamma(h-1) + ... + phi_k gamma(h-k)). These lead to a contradiction with the condition that gamma(0) > 0. (I'll let you 'assume' that the coefficients of the process are bounded, but you can prove this as well if you are on a roll!)
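    For the ARMA fitting in 3.31, here is a minimal sketch of the suggested workflow. The series name "temp" is a placeholder for the temperature data from the text, and the ARMA(1,1) order is only an illustration; let your diagnostics choose the orders.

      dtemp <- diff(temp)                       # model the differences, as the problem asks
      fit <- arima(dtemp, order = c(1, 0, 1))   # e.g., an ARMA(1,1); try other orders as well
      fit                                       # coefficient estimates and standard errors
      tsdiag(fit)                               # residual diagnostics for the fitted model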

  3. The final assignment! Due April 12.
    Solution
    Chapter 6: Exercises 6.1, 6.2, 6.3, 6.5, 6.6, 6.13 (3rd Edition: the same!)
    Plus a roughly one-page description of what you are doing for your class project.