Class 16: Discussion of Array Factorizations
- Factorization methods for rectangular matrices
- The QR and SV decompositions generalize the Cholesky and spectral
  decompositions to apply to rectangular matrices X, such
  as the design matrix of predictors in regression.
- LU factorization, S = LU
  - This factorization is a basic building block, supporting inverse,
    determinant, and solve operations.
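As a sketch of these three uses (a hypothetical example matrix, using SciPy's LU routines rather than the class's Lisp code):

```python
import numpy as np
from scipy.linalg import lu, lu_factor, lu_solve

# a small square matrix (made-up example data)
S = np.array([[4.0, 2.0, 1.0],
              [2.0, 5.0, 3.0],
              [1.0, 3.0, 6.0]])

# LU with partial pivoting: S = P @ L @ U
P, L, U = lu(S)

# determinant: det(L) = 1, so det(S) = det(P) * prod(diag(U))
det_S = np.linalg.det(P) * np.prod(np.diag(U))

# solve S x = b by reusing the factorization (two triangular solves)
b = np.array([1.0, 2.0, 3.0])
x = lu_solve(lu_factor(S), b)
```

Once the factorization is in hand, each additional solve costs only two triangular back-substitutions.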
- Cholesky, S = LL'
  - This factorization specializes the LU to p.s.d. matrices, leading to
    a way of finding the "square root" of a matrix. It is also tied closely
    to regression.
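A sketch of both points (simulated data, not the class example): the Cholesky factor is a triangular "square root" of X'X, and the normal equations of regression reduce to two triangular solves.

```python
import numpy as np
from scipy.linalg import solve_triangular

# simulated design matrix; S = X'X is p.s.d.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
S = X.T @ X

# lower-triangular L with S = L L'
L = np.linalg.cholesky(S)

# regression: solve X'X b = X'y as L z = X'y, then L' b = z
y = rng.standard_normal(20)
z = solve_triangular(L, X.T @ y, lower=True)
b = solve_triangular(L.T, z, lower=False)
```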
- Spectral, S = E D E'
  - This orthogonal decomposition leads to principal components
    and an alternative definition of a matrix square root.
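The alternative square root can be sketched as follows (simulated covariance-style matrix): with S = E D E', the symmetric matrix R = E sqrt(D) E' satisfies R R = S.

```python
import numpy as np

# simulated p.s.d. matrix, e.g. a sample covariance
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))
S = X.T @ X / 50

# eigh is for symmetric matrices: S = E @ diag(d) @ E', E orthogonal
d, E = np.linalg.eigh(S)

# symmetric square root (unlike the triangular Cholesky factor)
R = E @ np.diag(np.sqrt(d)) @ E.T
```

The columns of E are also the directions of the principal components of the data.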
- QR decomposition
- This is a Gram-Schmidt decomposition of a matrix, and leads to a
more stable method for solving ill-conditioned systems of equations, such
as those in regression with high collinearity.
- Singular value decomposition, SVD
- The SVD generalizes the notion of eigenvectors. The SVD leads to a
very different way to do regression, known as total least squares (TLS). TLS
is one approach to the problem of "errors in variables" in regression.
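TLS can be sketched via the SVD of the augmented matrix [X y]: the coefficient vector comes from the right singular vector belonging to the smallest singular value (simulated errors-in-variables data; tolerances are illustrative).

```python
import numpy as np

# simulate noise in BOTH the predictors and the response
rng = np.random.default_rng(3)
n, p = 100, 2
b_true = np.array([2.0, -1.0])
X_true = rng.standard_normal((n, p))
X = X_true + 0.05 * rng.standard_normal((n, p))
y = X_true @ b_true + 0.05 * rng.standard_normal(n)

# TLS: SVD of the augmented matrix Z = [X y]
Z = np.column_stack([X, y])
U, s, Vt = np.linalg.svd(Z, full_matrices=False)

# right singular vector for the smallest singular value defines the fit
v = Vt[-1]
b_tls = -v[:p] / v[p]
```

Ordinary least squares minimizes vertical errors only; TLS minimizes perpendicular distance to the fitted hyperplane, which is why it suits the errors-in-variables setting.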
Status of Projects
- Movies are deadly to my browser, so please just put single images. Link
the movie to your page rather than force everyone to view it.
- Lisp script for today's class
Next time
We will start looking at splines and smoothing splines.