Solving least squares equations - the QR decomposition.

Preliminaries.

Write the squared length of a vector $x$ as $\|x\|^2 = x^Tx$.

The least squares estimator $\hat\beta$ solves the problem:

\[ \min_\beta \|Y - X\beta\|^2 . \]

Which linear combination of the covariates is ``closest'' to the observed data, $Y$?

Define the squared length of the residual vector, $\|Y - X\hat\beta\|^2$, as the residual sum of squares (RSS), and define $\hat\sigma^2 = \mathrm{RSS}/(n-p)$.
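As a concrete illustration, here is a minimal numpy sketch of these quantities; the design matrix X, the response Y, and the use of np.linalg.lstsq are made-up for the example, not taken from the notes.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 20, 3
    X = rng.standard_normal((n, p))   # hypothetical design matrix
    Y = rng.standard_normal(n)        # hypothetical response

    # Least squares estimate: minimizes ||Y - X beta||^2
    beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

    resid = Y - X @ beta_hat          # residual vector
    RSS = resid @ resid               # squared length of the residual vector
    sigma2_hat = RSS / (n - p)        # estimate of the error variance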

Define an $n \times n$ matrix $Q$ to be orthogonal if $Q$ satisfies, for any vector $x$,

\[ \|Qx\|^2 = \|x\|^2 . \]

Orthogonal transformations preserve length. Equivalently, $Q^TQ = I$, since then $\|Qx\|^2 = (Qx)^T(Qx) = x^TQ^TQx = x^Tx = \|x\|^2$.
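A quick numerical check of this invariance; the orthogonal $Q$ here is obtained from the QR factorization of a random matrix, which is an assumption for the example only.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 5
    # The QR factorization of any full-rank matrix yields an orthogonal Q
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    x = rng.standard_normal(n)

    print(np.linalg.norm(Q @ x), np.linalg.norm(x))  # equal up to rounding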

We can apply an orthogonal transformation $Q$ to a regression problem:

\[ \min_\beta \|Y - X\beta\|^2 \]

becomes

\[ \min_\beta \|QY - QX\beta\|^2 . \]

Because $Q$ preserves length, $\|QY - QX\beta\|^2 = \|Q(Y - X\beta)\|^2 = \|Y - X\beta\|^2$ for every $\beta$, so the two least squares problems have the same objective and a solution to one is a solution to the other.
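One can confirm this equivalence numerically; as before, X, Y, and the random orthogonal Q are made-up for the sketch.

    import numpy as np

    rng = np.random.default_rng(2)
    n, p = 20, 3
    X = rng.standard_normal((n, p))
    Y = rng.standard_normal(n)
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random n x n orthogonal Q

    beta1, *_ = np.linalg.lstsq(X, Y, rcond=None)
    beta2, *_ = np.linalg.lstsq(Q @ X, Q @ Y, rcond=None)

    print(np.allclose(beta1, beta2))  # True: same least squares solution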

Can we choose Q to make the problem simpler and more stable?

Choose $Q$ so that the upper $p \times p$ block of $QX$ is upper triangular and the lower $(n-p) \times p$ block is all zeroes.

Now, write $R$ for the upper triangular $p \times p$ block of $QX$ and partition $QY$ conformably into its first $p$ elements, $(QY)_1$, and its last $n-p$ elements, $(QY)_2$. Then

\[ \|QY - QX\beta\|^2 = \|(QY)_1 - R\beta\|^2 + \|(QY)_2\|^2 , \]

and only the first term depends on $\beta$, giving

\[ R\hat\beta = (QY)_1 , \]

which can be solved recursively, by back-substitution, without inverting any matrix.
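A minimal numpy/scipy sketch of the whole procedure, again with made-up X and Y. Note that numpy factors $X = \tilde Q R$ with $\tilde Q$ orthogonal, so the $Q$ of these notes corresponds to $\tilde Q^T$.

    import numpy as np
    from scipy.linalg import solve_triangular

    rng = np.random.default_rng(3)
    n, p = 20, 3
    X = rng.standard_normal((n, p))
    Y = rng.standard_normal(n)

    # Full factorization: X = Q_np @ R_full with Q_np (n x n) orthogonal.
    # The Q of these notes is Q_np.T, so Q_np.T @ X = R_full has an upper
    # triangular p x p block sitting on an (n-p) x p block of zeroes.
    Q_np, R_full = np.linalg.qr(X, mode='complete')
    R = R_full[:p, :]          # upper triangular p x p block
    QY = Q_np.T @ Y            # transformed response
    c1, c2 = QY[:p], QY[p:]    # conformable partition of QY

    # Back-substitution solves R beta = c1 without forming an inverse
    beta_hat = solve_triangular(R, c1)

    RSS = c2 @ c2              # leftover term ||(QY)_2||^2 is the RSS
    print(np.allclose(beta_hat, np.linalg.lstsq(X, Y, rcond=None)[0]))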


