Exercise "matrix diagonalization"
Implement the Jacobi rotation method for diagonalization of real symmetric matrices using the cyclic sweep algorithm (sequentially zero all off-diagonal elements, row after row). Make some interesting examples demonstrating that your implementation works as intended.
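A minimal sketch of such a cyclic-sweep routine might look like the following (plain Python with matrices as lists of lists; the function name jacobi_cyclic and the naive atan2-based angle computation are illustrative choices, not prescribed by the exercise):

```python
import math

def jacobi_cyclic(A, tol=1e-12, max_sweeps=100):
    """Diagonalize a real symmetric matrix A in place by cyclic Jacobi
    sweeps; returns (eigenvalues, V) where the columns of V are the
    eigenvectors.  Naive variant: gets the angle via atan2."""
    n = len(A)
    V = [[float(i == j) for j in range(n)] for i in range(n)]
    for _ in range(max_sweeps):
        changed = False
        for p in range(n):
            for q in range(p + 1, n):       # zero A[p][q], row after row
                if abs(A[p][q]) < tol:
                    continue
                changed = True
                phi = 0.5 * math.atan2(2.0 * A[p][q], A[q][q] - A[p][p])
                c, s = math.cos(phi), math.sin(phi)
                for k in range(n):          # columns: A <- A J
                    akp, akq = A[k][p], A[k][q]
                    A[k][p] = c * akp - s * akq
                    A[k][q] = s * akp + c * akq
                for k in range(n):          # rows: A <- J^T A
                    apk, aqk = A[p][k], A[q][k]
                    A[p][k] = c * apk - s * aqk
                    A[q][k] = s * apk + c * aqk
                for k in range(n):          # accumulate eigenvectors: V <- V J
                    vkp, vkq = V[k][p], V[k][q]
                    V[k][p] = c * vkp - s * vkq
                    V[k][q] = s * vkp + c * vkq
        if not changed:
            break
    return [A[i][i] for i in range(n)], V
```

A natural correctness check is to verify A·v = λ·v for each returned pair on a small matrix with known eigenvalues, e.g. [[2,1],[1,2]] with eigenvalues 1 and 3.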
In the Jacobi diagonalization method the rotation angle φ is given through
cot(2φ) = (A_qq - A_pp)/(2 A_pq) .
However, in the subsequent calculations it is not the angle itself but rather its sine and cosine that are used. A naive approach would need one inverse-cotangent evaluation plus one sine and one cosine evaluation per rotation. It is, however, possible to completely avoid calculations of expensive trigonometric functions. Indeed, using the trigonometric identity
cot(2φ) = (1 - tan²φ)/(2 tanφ) ,
one can rewrite it as the quadratic equation tan²φ + 2 cot(2φ) tanφ - 1 = 0 and take tanφ as its smaller (in magnitude) root,
tanφ = sign(cot(2φ)) (√(cot²(2φ)+1) - |cot(2φ)|) .
If |cot(2φ)| > 1 it is better to use the equivalent formula,
tanφ = cot(2φ) [√(1 + 1/cot²(2φ)) - 1] .
Now sinφ and cosφ can be calculated as
cosφ = 1/√(1 + tan²φ) ,
sinφ = cosφ tanφ .
Thus we have calculated our sinφ and cosφ without calls to trigonometric functions, which should be significantly faster (since the square-root function is almost certainly implemented in hardware).
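The steps above can be collected into a small helper that returns the rotation coefficients using square roots only (the function name and argument order are illustrative):

```python
import math

def rotation_coeffs(app, aqq, apq):
    """Return (cos(phi), sin(phi)) for the Jacobi rotation that zeroes
    A[p][q], computed without trigonometric function calls."""
    theta = (aqq - app) / (2.0 * apq)          # theta = cot(2*phi)
    if abs(theta) > 1.0:                       # alternative formula for large |theta|
        t = theta * (math.sqrt(1.0 + 1.0 / theta**2) - 1.0)
    else:                                      # smaller root of the quadratic
        t = math.copysign(math.sqrt(theta**2 + 1.0) - abs(theta), theta)
    c = 1.0 / math.sqrt(1.0 + t * t)           # cos from tan
    return c, c * t                            # sin = cos * tan
```

One can verify the result without trigonometry at all: the returned (c, s) must satisfy the zeroing condition (c² - s²) A_pq + c s (A_pp - A_qq) = 0.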
So, the exercise is: implement this optimized strategy in a new Jacobi diagonalization subroutine and find out what the speed gain is compared to your old naive implementation from item A. Some books claim the speed gain is significant, though I am a bit sceptical about that, since modern hardware should be rather good at calculating trigonometric functions. It may also depend on the programming language one uses. So, find out...
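As a rough way to measure the difference, one can time the two ways of obtaining (cosφ, sinφ) in isolation (Python here purely for illustration; in an interpreted language the interpreter overhead may well mask the difference, which rather supports the scepticism above):

```python
import math
import timeit

def cs_naive(app, aqq, apq):
    """Angle via atan2, then sin and cos."""
    phi = 0.5 * math.atan2(2.0 * apq, aqq - app)
    return math.cos(phi), math.sin(phi)

def cs_sqrt(app, aqq, apq):
    """Trig-free variant: square roots only."""
    theta = (aqq - app) / (2.0 * apq)
    if abs(theta) > 1.0:
        t = theta * (math.sqrt(1.0 + 1.0 / theta**2) - 1.0)
    else:
        t = math.copysign(math.sqrt(theta**2 + 1.0) - abs(theta), theta)
    c = 1.0 / math.sqrt(1.0 + t * t)
    return c, c * t

t1 = timeit.timeit(lambda: cs_naive(1.0, 3.0, 0.5), number=100_000)
t2 = timeit.timeit(lambda: cs_sqrt(1.0, 3.0, 0.5), number=100_000)
print(f"atan/sin/cos: {t1:.3f} s,  sqrt-only: {t2:.3f} s")
```

Both variants produce a rotation that zeroes the target element, so they can be cross-checked against the zeroing condition before being timed.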
Implement the inverse-iteration algorithm with (Rayleigh quotient) shifts and make some interesting illustrations. You must use your own QR-factorization routines here to solve the linear systems (just as an extra check of those routines).
Find out how much faster it is to find one eigenvalue (and the corresponding eigenvector) by inverse iteration than by full diagonalization.
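A sketch of the shifted inverse iteration might look as follows. The compact modified Gram-Schmidt QR below is only a stand-in for your own QR routines, and the starting vector, stopping criterion, and the guard against an exactly singular shifted matrix are my own conventional choices:

```python
import math

def qr_decomp(A):
    """Modified Gram-Schmidt QR of a square matrix (lists of lists)."""
    n = len(A)
    Q = [[A[i][j] for j in range(n)] for i in range(n)]
    R = [[0.0] * n for _ in range(n)]
    for j in range(n):
        for i in range(j):
            R[i][j] = sum(Q[k][i] * Q[k][j] for k in range(n))
            for k in range(n):
                Q[k][j] -= R[i][j] * Q[k][i]
        R[j][j] = math.sqrt(sum(Q[k][j] ** 2 for k in range(n)))
        for k in range(n):
            Q[k][j] /= R[j][j]
    return Q, R

def qr_solve(A, b):
    """Solve A x = b via QR: back-substitute R x = Q^T b."""
    n = len(A)
    Q, R = qr_decomp(A)
    y = [sum(Q[k][i] * b[k] for k in range(n)) for i in range(n)]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - sum(R[i][k] * x[k] for k in range(i + 1, n))) / R[i][i]
    return x

def inverse_iteration(A, s0, tol=1e-10, max_iter=100):
    """Inverse iteration with Rayleigh-quotient shifts; returns
    (eigenvalue, eigenvector) for the eigenvalue nearest s0."""
    n = len(A)
    v = [1.0] + [0.0] * (n - 1)               # generic starting vector
    s = s0
    for _ in range(max_iter):
        M = [[A[i][j] - (s if i == j else 0.0) for j in range(n)]
             for i in range(n)]
        try:
            w = qr_solve(M, v)                # (A - s*1) w = v
        except ZeroDivisionError:             # shift hit an eigenvalue exactly
            return s, v
        norm = math.sqrt(sum(wi * wi for wi in w))
        v = [wi / norm for wi in w]
        s_new = sum(v[i] * A[i][j] * v[j]     # Rayleigh quotient (v is normalized)
                    for i in range(n) for j in range(n))
        if abs(s_new - s) < tol:
            return s_new, v
        s = s_new
    return s, v
```

For the timing comparison asked for above, one would run this against the full Jacobi diagonalization on the same matrix; the iteration typically converges in a handful of steps thanks to the cubic convergence of the Rayleigh-quotient shift.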