Lecture 17 Summary

Re-interpreted Arnoldi and Lanczos as partial Hessenberg factorizations. Showed that they correspond to $AQ_n = Q_n H_n + h_{n+1,n} q_{n+1} e_n^*$ for Arnoldi and $AQ_n = Q_n T_n + h_{n+1,n} q_{n+1} e_n^*$ for Lanczos, where $H_n$ and $T_n$ are upper-Hessenberg and tridiagonal matrices respectively and $e_n$ is the unit vector in the $n$-th direction.
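
A minimal NumPy sketch of this relation (not code from the lecture; the `arnoldi` function and the random test matrix are just for illustration): it runs $n$ Arnoldi steps and numerically checks $AQ_n = Q_n H_n + h_{n+1,n} q_{n+1} e_n^*$.

```python
import numpy as np

def arnoldi(A, b, n):
    """Run n Arnoldi steps; return Q (m x (n+1)) and H ((n+1) x n)."""
    m = A.shape[0]
    Q = np.zeros((m, n + 1))
    H = np.zeros((n + 1, n))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(n):
        v = A @ Q[:, j]
        for i in range(j + 1):            # orthogonalize against all previous q_i
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        Q[:, j + 1] = v / H[j + 1, j]
    return Q, H

rng = np.random.default_rng(0)
m, n = 200, 20
A = rng.standard_normal((m, m))
Q, H = arnoldi(A, rng.standard_normal(m), n)

Qn, Hn = Q[:, :n], H[:n, :]               # square upper-Hessenberg part H_n
en = np.zeros(n); en[-1] = 1.0
residual = A @ Qn - (Qn @ Hn + H[n, n - 1] * np.outer(Q[:, n], en))
print(np.linalg.norm(residual))           # ~ machine precision
```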

Discussed how rounding problems cause a loss of orthogonality in Lanczos, leading to "ghost" eigenvalues in which converged extremal eigenvalues re-appear as spurious copies (see the sketch below). In Arnoldi, we explicitly store and orthogonalize against all of the $q_j$ vectors, but then another problem arises: the cost per step grows as $n$ increases. The solution is a restarting scheme, in which we run for $n$ steps and then restart with the $k$ "best" eigenvectors. For $k=1$ this is easy, but explained why it is nontrivial for $k>1$: we need to restart in a way that maintains the Lanczos (or Arnoldi) property that the residual $AQ_n - Q_n H_n$ is nonzero only in the $n$-th column (and that column is orthogonal to $Q_n$).
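
To illustrate the loss-of-orthogonality problem, here is a hedged NumPy sketch (the `lanczos` function and the synthetic spectrum are my own, not from the lecture): it runs Lanczos with only the three-term recurrence and no re-orthogonalization, and the well-separated extremal eigenvalues typically show up multiple times among the Ritz values.

```python
import numpy as np

def lanczos(A, b, n):
    """Run n Lanczos steps (no re-orthogonalization); return diagonals of T_n."""
    alpha, beta = np.zeros(n), np.zeros(n - 1)
    q_prev = np.zeros_like(b)
    q = b / np.linalg.norm(b)
    for j in range(n):
        v = A @ q
        alpha[j] = q @ v
        v = v - alpha[j] * q - (beta[j - 1] * q_prev if j > 0 else 0)
        if j < n - 1:
            beta[j] = np.linalg.norm(v)
            q_prev, q = q, v / beta[j]
    return alpha, beta

rng = np.random.default_rng(0)
m, n = 400, 120
# symmetric test matrix: a cluster of eigenvalues in [0,1] plus well-separated
# extremal eigenvalues 10, 20, 30 that converge (and then "ghost") first
X, _ = np.linalg.qr(rng.standard_normal((m, m)))
eigs = np.concatenate([np.linspace(0.0, 1.0, m - 3), [10.0, 20.0, 30.0]])
A = X @ np.diag(eigs) @ X.T

alpha, beta = lanczos(A, rng.standard_normal(m), n)
T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
print(np.linalg.eigvalsh(T)[-8:])   # look for duplicated copies of 30, 20, 10
```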

Discussed the implicitly restarted Lanczos method, which performs $n-k$ steps of shifted QR to reduce the problem from $n$ to $k$ dimensions. The key point is that, because the $Q$ matrices in QR factorizations of tridiagonal (upper-Hessenberg) matrices are themselves upper Hessenberg, their product can be shown to preserve the Lanczos property of the residual for the first $k$ columns.
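
A small numerical check of that structural fact, under stated assumptions (a random symmetric tridiagonal $T_n$ and arbitrary shifts, rather than the exact shifts an implicit restart would actually use): the accumulated orthogonal factor $V$ from $n-k$ shifted-QR steps is a product of upper-Hessenberg matrices, so its last row $e_n^* V$ vanishes in its first $k-1$ entries. The updated residual $h_{n+1,n} q_{n+1} e_n^* V$ therefore touches only columns $k,\dots,n$, and truncating to the first $k$ columns keeps the Lanczos form.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 12, 4

# random symmetric tridiagonal T_n (stand-in for the Lanczos matrix)
alpha = rng.standard_normal(n)
beta = rng.random(n - 1) + 0.1
T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

V = np.eye(n)
shifts = rng.standard_normal(n - k)   # exact shifts would be unwanted Ritz values;
for sigma in shifts:                  # random ones suffice to check the structure
    Q, R = np.linalg.qr(T - sigma * np.eye(n))
    T = Q.T @ T @ Q                   # similarity transform: tridiagonal up to roundoff
    V = V @ Q                         # each Q is upper Hessenberg

# last row of V is ~0 in its first k-1 entries, so the residual only hits columns k..n
print(np.max(np.abs(V[-1, :k - 1])))
```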