Class 14: Array Factorizations for Rectangular Matrices


Review from previous class

Factorization methods for square matrices
The LU, Cholesky, and spectral decompositions all work with a square matrix S, most often a covariance matrix in statistics.

LU Factorization, S = LU
This factorization is a basic building block, supporting inverse, determinant and solve operations.
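
A minimal sketch, in Python/NumPy rather than the class's Lisp, of the operations this factorization supports; the matrix S and right-hand side are made up for illustration.

    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    S = np.array([[4.0, 2.0],
                  [2.0, 3.0]])                      # small example matrix (assumed)
    lu, piv = lu_factor(S)                          # compact pivoted storage of L and U
    x = lu_solve((lu, piv), np.array([1.0, 2.0]))   # solve S x = b without forming the inverse
    det = np.prod(np.diag(lu))                      # diagonal of U gives det(S), up to the pivot sign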

Cholesky, S = LL'
This factorization specializes the LU to p.s.d. matrices, leading to a way of finding the "square root" of a matrix. It's also tied closely to regression.
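
A short NumPy sketch (example matrix assumed, not from the notes) showing the square-root interpretation: the lower triangular Cholesky factor L reproduces S when multiplied by its transpose.

    import numpy as np

    S = np.array([[4.0, 2.0],
                  [2.0, 3.0]])          # p.s.d. example matrix (assumed)
    L = np.linalg.cholesky(S)           # lower triangular factor with S = L L'
    print(np.allclose(L @ L.T, S))      # True: L acts as a "square root" of S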

Spectral, S = E D E'
The decomposition of a matrix into a weighted sum of rank 1 matrices leads to numerous statistical applications, including principal components and an alternative definition of a matrix square root.
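
A brief NumPy illustration (example matrix assumed) of both points: rebuilding S as a weighted sum of rank 1 matrices e_j e_j', and forming the alternative square root E D^(1/2) E'.

    import numpy as np

    S = np.array([[4.0, 2.0],
                  [2.0, 3.0]])
    d, E = np.linalg.eigh(S)                                   # eigenvalues d, orthonormal eigenvectors E
    rank1 = sum(d[j] * np.outer(E[:, j], E[:, j]) for j in range(len(d)))
    root = E @ np.diag(np.sqrt(d)) @ E.T                       # spectral square root
    print(np.allclose(rank1, S), np.allclose(root @ root, S))  # True True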


Status of Projects


Some thoughts on the comparison of S to LispStat

These email messages describe, in a somewhat even-handed way, some of the differences between LispStat and S.

Decomposing Non-Square Matrices

QR decomposition, X = QR
Rather than factor the matrix entirely into triangular forms, the QR uses a Gram-Schmidt style reduction to write the matrix as the product of an orthogonal matrix Q and an upper triangular matrix R. This is the first factorization we have seen that works with non-square matrices, and it is a popular choice for regression calculations. The upper triangular matrix R of this decomposition is related to the triangular factor from the Cholesky decomposition of X'X.
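
A sketch of the regression use and the Cholesky connection, written in Python/NumPy rather than the class's Lisp; the design matrix X and response y are simulated only to make the code self-contained.

    import numpy as np
    from scipy.linalg import solve_triangular

    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])   # assumed design matrix
    y = rng.normal(size=20)                                        # assumed response
    Q, R = np.linalg.qr(X)                      # Q has orthonormal columns, R is upper triangular
    b = solve_triangular(R, Q.T @ y)            # back-substitution gives the least-squares coefficients
    print(np.allclose(b, np.linalg.lstsq(X, y, rcond=None)[0]))    # agrees with ordinary least squares
    print(np.allclose(np.abs(R), np.abs(np.linalg.cholesky(X.T @ X).T)))   # R matches the Cholesky factor of X'X up to signs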

Singular value decomposition (SVD), X = U D V'
The SVD generalizes the notion of eigenvectors to non-square matrices. Some of the information in the SVD reproduces the usual eigenvalue decomposition of the covariance matrix, but much of it is new.
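
A small NumPy check (simulated data, names assumed) of the overlap with the eigenvalue decomposition: the squared singular values of X equal the eigenvalues of X'X, while the left singular vectors in U are information the eigen analysis of X'X does not provide.

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(20, 3))                        # assumed rectangular data matrix
    U, d, Vt = np.linalg.svd(X, full_matrices=False)    # X = U diag(d) V'
    evals, evecs = np.linalg.eigh(X.T @ X)              # eigen decomposition of the cross-product matrix
    print(np.allclose(np.sort(d**2), np.sort(evals)))   # True: squared singular values = eigenvalues of X'X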

The SVD also leads to a very different way to do regression, known as total least squares (TLS), which treats the errors in the predictors and the response symmetrically, unlike the usual OLS model.
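
A sketch of the standard SVD construction of TLS for a single predictor (simulated data, all names assumed): after centering, the fitted slope comes from the right singular vector of [x y] with the smallest singular value, and it differs from the OLS slope because both variables carry error.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 50
    x_true = rng.normal(size=n)
    x = x_true + 0.3 * rng.normal(size=n)          # error in the predictor
    y = 2.0 * x_true + 0.3 * rng.normal(size=n)    # error in the response
    Z = np.column_stack([x, y])
    Z = Z - Z.mean(axis=0)                         # center both variables
    U, d, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]                                     # right singular vector for the smallest singular value
    slope_tls = -v[0] / v[1]                       # total least squares slope
    slope_ols = np.polyfit(x, y, 1)[0]             # ordinary least squares slope, for comparison
    print(slope_tls, slope_ols)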

Lisp script for today's class
class15.lsp


Next time

We will start looking at splines and smoothing splines.