Hessenberg factorization of a real square matrix using Jacobi transformations
A lower Hessenberg matrix H is a square matrix with zero elements above the first superdiagonal, Hij=0 for j>i+1.
An upper Hessenberg matrix H is a square matrix with zero elements below the first subdiagonal, Hij=0 for i>j+1.
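For example, for n=5 the two patterns look like this (x marks a generally nonzero element):

    lower Hessenberg    upper Hessenberg
    x x 0 0 0           x x x x x
    x x x 0 0           x x x x x
    x x x x 0           0 x x x x
    x x x x x           0 0 x x x
    x x x x x           0 0 0 x x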
Hessenberg factorization is a representation of a square real matrix A in the form A=QHQ^T, where Q is an orthogonal matrix and H is a Hessenberg matrix.
If A is a symmetric matrix, its Hessenberg factorization produces a tridiagonal matrix H.
If the constraints of a linear algebra problem do not allow a general matrix to be conveniently reduced to triangular form, reduction to Hessenberg form is often the next best thing. The reduction of any matrix to Hessenberg form can be achieved in a finite number of steps, and many linear algebra algorithms require significantly less computational effort when applied to Hessenberg matrices.
Consider a Jacobi transformation with the matrix J(p,q),
A→J(p,q)TAJ(p,q)
where the rotation angle is chosen not to zero the element Ap,q but rather to zero the element Ap-1,q. Argue that the following sequence of such Jacobi transformations,
J(2,3), J(2,4), ..., J(2,n); J(3,4), ..., J(3,n); ...; J(n-1,n)
reduces the matrix to the lower Hessenberg form.
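The sequence above can be sketched directly in code. The routine below (the name hessenberg_jacobi and the NumPy realization are illustrative, not prescribed by the text) applies each rotation J(p,q) with the angle chosen to zero the element A(p-1,q), and accumulates Q so that A = Q H Q^T:

```python
import numpy as np

def hessenberg_jacobi(A):
    """Reduce a real square matrix to lower Hessenberg form, A = Q H Q^T,
    by the sequence of Jacobi rotations J(2,3), ..., J(n-1,n).
    A sketch, not a library routine.  Uses 0-based indices: the rotation
    in the (p,q) plane zeroes the element H[p-1, q]."""
    H = np.array(A, dtype=float)
    n = H.shape[0]
    Q = np.eye(n)
    for p in range(1, n - 1):
        for q in range(p + 1, n):
            r = np.hypot(H[p - 1, p], H[p - 1, q])
            if r == 0.0:
                continue  # target element is already zero
            c = H[p - 1, p] / r
            s = -H[p - 1, q] / r
            # right multiplication by J: mix columns p and q;
            # this sets H[p-1, q] to zero by the choice of c, s
            cp, cq = H[:, p].copy(), H[:, q].copy()
            H[:, p], H[:, q] = c * cp - s * cq, s * cp + c * cq
            # left multiplication by J^T: mix rows p and q
            # (row p-1 is untouched, so the new zero survives)
            rp, rq = H[p, :].copy(), H[q, :].copy()
            H[p, :], H[q, :] = c * rp - s * rq, s * rp + c * rq
            # accumulate Q as the product of the rotations
            qp, qq = Q[:, p].copy(), Q[:, q].copy()
            Q[:, p], Q[:, q] = c * qp - s * qq, s * qp + c * qq
    return H, Q
```

Note that each rotation mixes only rows and columns p and q, neither of which contains previously created zeros that the rotation could destroy; tracing this invariant through the sweep is essentially the argument the exercise asks for.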
Find out (by time measurements) which factorization is faster, Hessenberg or QR.
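One possible timing experiment, assuming SciPy's hessenberg and NumPy's qr as reference implementations (the library choice is an assumption; any implementations of the two factorizations will do):

```python
import timeit
import numpy as np
from scipy.linalg import hessenberg  # assumed available; any Hessenberg routine works

n = 300
A = np.random.default_rng(2).standard_normal((n, n))

# best-of-three timings of three calls each, to damp out noise
t_hess = min(timeit.repeat(lambda: hessenberg(A), number=3, repeat=3))
t_qr = min(timeit.repeat(lambda: np.linalg.qr(A), number=3, repeat=3))
print(f"Hessenberg: {t_hess:.4f} s,  QR: {t_qr:.4f} s")
```

Both factorizations cost O(n^3) for a dense matrix, so the comparison is about the constant factor; try several values of n before drawing a conclusion.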
Check if QR-factorization of an upper Hessenberg matrix using Givens rotations can be done in O(n²) operations.
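A sketch of such a factorization (the name qr_hessenberg is illustrative), under the convention H = QR: each of the n-1 Givens rotations zeroes one subdiagonal element and touches only two rows, giving O(n) work per rotation and O(n^2) in total:

```python
import numpy as np

def qr_hessenberg(H):
    """QR-factorize an upper Hessenberg matrix with n-1 Givens rotations.
    Returns (Q, R) with Q @ R = H, Q orthogonal, R upper triangular."""
    R = np.array(H, dtype=float)
    n = R.shape[0]
    Q = np.eye(n)
    for i in range(n - 1):
        r = np.hypot(R[i, i], R[i + 1, i])
        if r == 0.0:
            continue  # subdiagonal element already zero
        c, s = R[i, i] / r, R[i + 1, i] / r
        # rotate rows i and i+1; only columns i..n-1 can be nonzero there,
        # so each rotation costs O(n) operations
        ri, rj = R[i, i:].copy(), R[i + 1, i:].copy()
        R[i, i:], R[i + 1, i:] = c * ri + s * rj, -s * ri + c * rj
        # accumulate Q by right-multiplying with the transposed rotation
        qi, qj = Q[:, i].copy(), Q[:, i + 1].copy()
        Q[:, i], Q[:, i + 1] = c * qi + s * qj, -s * qi + c * qj
    return Q, R
```

Counting the flops in the inner updates confirms the O(n^2) estimate; a dense Householder QR costs O(n^3) by comparison.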
Check whether the Jacobi eigenvalue algorithm, applied cleverly to a tridiagonal matrix (eigenvalue by eigenvalue, skipping rotations on elements that are already zero), takes O(n) operations.
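As a baseline for this experiment, a plain cyclic Jacobi eigenvalue algorithm (no tridiagonal shortcuts beyond skipping zero elements; the name jacobi_eigvals is illustrative) can be sketched as:

```python
import numpy as np

def jacobi_eigvals(A, tol=1e-10, max_sweeps=100):
    """Cyclic Jacobi eigenvalue algorithm for a real symmetric matrix:
    repeatedly apply A <- J(p,q)^T A J(p,q) with the angle chosen to
    zero A[p,q], sweeping until the off-diagonal norm is below tol.
    Returns the eigenvalues in ascending order."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    for _ in range(max_sweeps):
        off = np.sum(np.tril(A, -1) ** 2)  # squared off-diagonal norm
        if off < tol ** 2:
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if A[p, q] == 0.0:
                    continue  # don't rotate on elements that are already zero
                # classic Jacobi angle: tan(2*theta) = 2 A_pq / (A_qq - A_pp)
                theta = 0.5 * np.arctan2(2.0 * A[p, q], A[q, q] - A[p, p])
                c, s = np.cos(theta), np.sin(theta)
                rp, rq = A[p, :].copy(), A[q, :].copy()
                A[p, :], A[q, :] = c * rp - s * rq, s * rp + c * rq
                cp, cq = A[:, p].copy(), A[:, q].copy()
                A[:, p], A[:, q] = c * cp - s * cq, s * cp + c * cq
    return np.sort(np.diag(A))
```

Instrumenting this routine with an operation counter, and then restricting the updates to the few entries a rotation can actually change in a tridiagonal matrix, is one way to carry out the check.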