Lecture Notes

16. Gram-Schmidt and the QR Factorization.

Part of the Series on Linear Algebra.

By Akshay Agrawal. Last updated Nov. 3, 2018.

Previous entry: Singular Value Decomposition; Next entry: Projections

In this section, we will present a procedure for converting any list of linearly independent vectors into a list of orthonormal vectors that preserves spans. Applying this procedure to the columns of a matrix and encoding the correspondence between the two lists of vectors as the product of two matrices yields the QR factorization, one of the most important factorizations in applied linear algebra.

The Gram-Schmidt Procedure

Let $a_1, \ldots, a_n$ be a list of linearly independent vectors. Define $q_1 = a_1 / \|a_1\|$, and for $i = 2, \ldots, n$, let

$$\tilde{q}_i = a_i - \sum_{j=1}^{i-1} (q_j^T a_i) q_j, \qquad q_i = \tilde{q}_i / \|\tilde{q}_i\|.$$

Then $q_1, \ldots, q_n$ is orthonormal, and $\operatorname{span}(q_1, \ldots, q_i) = \operatorname{span}(a_1, \ldots, a_i)$ for each $i$.

When applied to a list of linearly dependent vectors, one of the $\tilde{q}_i$ will equal $0$. Gram-Schmidt can therefore be used to determine whether a list of vectors is linearly dependent.
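As a sketch, the procedure can be implemented in a few lines of NumPy (the function name and the tolerance used to detect a zero $\tilde{q}_i$ in floating point are illustrative, not part of the notes):

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-10):
    """Orthonormalize a list of vectors; return None if they are dependent."""
    q = []
    for a in vectors:
        # Subtract from a its components along the previously computed q_j.
        q_tilde = a - sum((qj @ a) * qj for qj in q)
        norm = np.linalg.norm(q_tilde)
        if norm < tol:  # a lies in the span of the earlier vectors
            return None
        q.append(q_tilde / norm)
    return q

# Independent vectors: the result is orthonormal.
qs = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                   np.array([1.0, 0.0, 1.0])])
# Dependent vectors: the procedure reports failure.
assert gram_schmidt([np.array([1.0, 2.0]), np.array([2.0, 4.0])]) is None
```

In exact arithmetic $\tilde{q}_i$ is exactly zero for a dependent list; numerically, one compares its norm against a small tolerance instead.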

The QR Factorization

The Reduced QR Factorization

Let $A \in \mathbf{R}^{m \times n}$ be full rank, with $m \geq n$, and let $a_1, \ldots, a_n$ denote the columns of $A$. Running Gram-Schmidt on the columns of $A$ yields a list of vectors $q_1, \ldots, q_n$ satisfying

$$a_i = r_{1i} q_1 + r_{2i} q_2 + \cdots + r_{ii} q_i, \quad i = 1, \ldots, n,$$

where $r_{ii} = \|\tilde{q}_i\|$, $r_{ji} = q_j^T a_i$ for $j < i$, and $r_{ji} = 0$ for $j > i$. This relation can be expressed succinctly as $A = \hat{Q}\hat{R}$, where $\hat{Q} \in \mathbf{R}^{m \times n}$ has columns $q_1, \ldots, q_n$ and $\hat{R} \in \mathbf{R}^{n \times n}$ is an upper triangular matrix with entries $r_{ij}$. This decomposition is called the reduced QR factorization of $A$.
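The relation above translates directly into code. A minimal NumPy sketch of the reduced factorization via classical Gram-Schmidt (the function name is illustrative):

```python
import numpy as np

def reduced_qr(A):
    """Reduced QR of a full-rank A (m >= n) via classical Gram-Schmidt."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for i in range(n):
        v = A[:, i].copy()
        for j in range(i):
            R[j, i] = Q[:, j] @ A[:, i]   # r_ji = q_j^T a_i
            v -= R[j, i] * Q[:, j]
        R[i, i] = np.linalg.norm(v)       # r_ii = ||q_tilde_i||
        Q[:, i] = v / R[i, i]
    return Q, R

A = np.array([[1.0, 2.0], [0.0, 1.0], [1.0, 0.0]])
Q, R = reduced_qr(A)
assert np.allclose(Q @ R, A)            # A = Q-hat R-hat
assert np.allclose(Q.T @ Q, np.eye(2))  # orthonormal columns
assert np.allclose(R, np.triu(R))       # R-hat is upper triangular
```

(Classical Gram-Schmidt is numerically less stable than the modified variant or Householder reflections; production code typically uses a library routine such as `np.linalg.qr`.)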

With a slight modification to the Gram-Schmidt procedure, we can obtain a QR factorization for rank-deficient matrices $A \in \mathbf{R}^{m \times n}$, $m \geq n$, $\operatorname{rank}(A) = k < n$. Run Gram-Schmidt on the columns $a_1, \ldots, a_n$ of $A$ as before. However, whenever $\tilde{q}_i = 0$ (i.e., whenever $a_i$ is in the span of the previously computed $q_j$), do not form $q_i$, and instead continue to $a_{i+1}$. In this fashion, we obtain $\hat{Q} \in \mathbf{R}^{m \times k}$ with orthonormal columns and an upper triangular $\hat{R} \in \mathbf{R}^{k \times n}$ such that $A = \hat{Q}\hat{R}$ (the $i$-th column of $\hat{R}$ is taken to be the coordinates of $a_i$ with respect to the columns of $\hat{Q}$).
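The modified procedure can be sketched as follows, again with an illustrative function name and tolerance; dependent columns contribute a column of coordinates to $\hat{R}$ but no new column to $\hat{Q}$:

```python
import numpy as np

def rank_deficient_qr(A, tol=1e-10):
    """QR of a possibly rank-deficient A: skip columns already in the span."""
    m, n = A.shape
    q_cols, r_cols = [], []
    for i in range(n):
        v = A[:, i].copy()
        coords = [qj @ A[:, i] for qj in q_cols]  # coordinates w.r.t. the q_j
        for c, qj in zip(coords, q_cols):
            v -= c * qj
        norm = np.linalg.norm(v)
        if norm >= tol:                 # a_i contributes a new direction
            q_cols.append(v / norm)
            coords.append(norm)
        r_cols.append(coords)           # i-th column of R-hat (padded below)
    k = len(q_cols)
    R = np.zeros((k, n))
    for i, coords in enumerate(r_cols):
        R[:len(coords), i] = coords
    return np.column_stack(q_cols), R

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])        # column 2 equals 2 * column 1
Q, R = rank_deficient_qr(A)
assert Q.shape == (2, 2) and R.shape == (2, 3)   # k = 2 < n = 3
assert np.allclose(Q @ R, A)
```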

The Full QR Factorization

Let $A \in \mathbf{R}^{m \times n}$, $m \geq n$, $\operatorname{rank}(A) = n$. The full QR factorization of $A$ is formed as follows: construct $\hat{Q}$, $\hat{R}$ as described in the previous section, and relabel them as $Q_1$, $R_1$. Let $q_1, \ldots, q_n$ be the columns of $Q_1$. Extend this list to an orthonormal basis of $\mathbf{R}^m$ to obtain $q_1, \ldots, q_m$, and let $Q_2$ be the matrix whose columns are $q_{n+1}, \ldots, q_m$. The full QR factorization of $A$ is written as

$$A = QR = \begin{bmatrix} Q_1 & Q_2 \end{bmatrix} \begin{bmatrix} R_1 \\ 0 \end{bmatrix},$$

where $Q \in \mathbf{R}^{m \times m}$ is the block matrix of $Q_1$ and $Q_2$, and $R \in \mathbf{R}^{m \times n}$ the block matrix of $R_1$ and $0$.
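NumPy computes the full factorization directly via `np.linalg.qr` with `mode='complete'` (internally it uses Householder reflections rather than Gram-Schmidt, so the signs of individual columns may differ from the construction above):

```python
import numpy as np

A = np.array([[1.0, 2.0], [0.0, 1.0], [1.0, 0.0]])  # m = 3, n = 2

# Full factorization: Q is m x m, R is m x n with zero rows below row n.
Q, R = np.linalg.qr(A, mode='complete')
assert Q.shape == (3, 3) and R.shape == (3, 2)
assert np.allclose(Q @ R, A)
assert np.allclose(Q.T @ Q, np.eye(3))  # the extended basis is orthonormal
assert np.allclose(R[2, :], 0.0)        # the bottom block of R is zero
```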

Here are some important properties of the QR factorization:

  1. $\mathcal{R}(A) = \mathcal{R}(Q_1)$.
  2. $\mathcal{R}(A)^\perp = \mathcal{R}(Q_2)$.
  3. $Q$ is an orthogonal matrix: $Q^T Q = Q Q^T = I$.
  4. If $A$ is invertible, $A^{-1} = R^{-1} Q^T$.
  5. If $A \in \mathbf{R}^{m \times n}$, $m \geq n$, $\operatorname{rank}(A) = n$, then $A^\dagger = \hat{R}^{-1}\hat{Q}^T$ and the projection onto $\mathcal{R}(A)$ is $\hat{Q}\hat{Q}^T$. (See the notes on least squares for a definition of $A^\dagger$.)
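Several of these properties can be checked numerically; the sketch below uses `np.linalg.qr` on random test matrices (the matrices themselves are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Invertibility property on a square matrix: A^{-1} = R^{-1} Q^T.
A = rng.standard_normal((4, 4))
Q, R = np.linalg.qr(A)
assert np.allclose(np.linalg.inv(A), np.linalg.inv(R) @ Q.T)

# Pseudo-inverse and projection on a tall full-rank matrix.
B = rng.standard_normal((5, 3))
Qh, Rh = np.linalg.qr(B)               # reduced factorization
pinv = np.linalg.inv(Rh) @ Qh.T        # R-hat^{-1} Q-hat^T
assert np.allclose(pinv, np.linalg.pinv(B))
P = Qh @ Qh.T                          # projection onto range(B)
assert np.allclose(P @ B, B)           # the projector fixes the columns of B
assert np.allclose(P @ P, P)           # and is idempotent
```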

Note. The QR factorization is typically only defined for tall-and-skinny matrices, i.e., matrices with more rows than columns. For a matrix $A \in \mathbf{R}^{m \times n}$ with $m < n$, attempting to do a QR factorization would yield

$$A = Q \begin{bmatrix} R_1 & R_2 \end{bmatrix}.$$

The rightmost matrix (which has $m$ rows and $n$ columns) is not upper triangular.
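For instance, applying `np.linalg.qr` to a short-and-wide matrix returns a rightmost factor with more columns than rows, which cannot be a square upper triangular matrix:

```python
import numpy as np

A = np.arange(8.0).reshape(2, 4)     # m = 2 < n = 4: short-and-wide
Q, R = np.linalg.qr(A)
assert Q.shape == (2, 2) and R.shape == (2, 4)
assert np.allclose(Q @ R, A)
# R has 2 rows and 4 columns: upper trapezoidal, not square upper triangular.
```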

References

  1. Sheldon Axler. Linear Algebra Done Right.
  2. Stephen Boyd and Sanjay Lall. EE 263 Course Notes.
  3. Lloyd Trefethen and David Bau. Numerical Linear Algebra.