Exercise "Minimization"
(6 points)
Implement the quasi-Newton minimization method with Broyden's symmetric rank-1 (SR1) update. Start with the identity matrix as the initial inverse Hessian and then apply the updates.
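One possible sketch of such a minimizer, using the SR1 update of the inverse Hessian together with a simple backtracking line search; the function name, tolerances, and line-search strategy are illustrative choices, not prescribed by the exercise:

```python
import numpy as np

def qnewton_sr1(f, grad, x0, tol=1e-6, max_iter=1000):
    """Quasi-Newton minimization with Broyden's symmetric rank-1 (SR1)
    update of the inverse Hessian (a sketch, not a tuned implementation)."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)            # start with the identity inverse Hessian
    g, fx = grad(x), f(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        dx = -H @ g               # quasi-Newton step
        lam = 1.0                 # simple backtracking line search
        while True:
            x_new = x + lam * dx
            f_new = f(x_new)
            if f_new < fx or lam < 1e-10:
                break
            lam /= 2
        s = x_new - x
        g_new = grad(x_new)
        y = g_new - g
        u = s - H @ y
        denom = u @ y
        # standard SR1 safeguard: skip the update when the denominator is tiny
        if abs(denom) > 1e-12 * np.linalg.norm(u) * np.linalg.norm(y):
            H = H + np.outer(u, u) / denom
        x, g, fx = x_new, g_new, f_new
    return x

# quick check on a quadratic with minimum at (1, 2)
fq = lambda p: (p[0] - 1)**2 + 2*(p[1] - 2)**2
gq = lambda p: np.array([2*(p[0] - 1), 4*(p[1] - 2)])
xmin = qnewton_sr1(fq, gq, [0.0, 0.0])
```

On a quadratic the SR1 updates reconstruct the exact inverse Hessian after a few steps, after which the method takes the exact Newton step.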
Find the minimum of Rosenbrock's valley function,
f(x,y) = (1-x)^2 + 100(y-x^2)^2.
Find the minima of Himmelblau's function,
f(x,y) = (x^2+y-11)^2 + (x+y^2-7)^2.
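For reference, the two test functions and their analytic gradients might be coded as follows (function names are illustrative):

```python
import numpy as np

def rosenbrock(p):
    """Rosenbrock's valley function; minimum f = 0 at (1, 1)."""
    x, y = p
    return (1 - x)**2 + 100*(y - x**2)**2

def rosenbrock_grad(p):
    """Analytic gradient of Rosenbrock's function."""
    x, y = p
    return np.array([-2*(1 - x) - 400*x*(y - x**2),
                     200*(y - x**2)])

def himmelblau(p):
    """Himmelblau's function; four minima with f = 0, e.g. at (3, 2)."""
    x, y = p
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

def himmelblau_grad(p):
    """Analytic gradient of Himmelblau's function."""
    x, y = p
    return np.array([4*x*(x**2 + y - 11) + 2*(x + y**2 - 7),
                     2*(x**2 + y - 11) + 4*y*(x + y**2 - 7)])
```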
Compare the effectiveness (that is, the number of evaluations of the objective function) of i) your minimization method and ii) Newton's root-finding method from the root-finding exercise applied to the equivalent problem of finding a root of the gradient of the objective function.
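The comparison could be set up along these lines: wrap the function in a call counter and run a plain Newton root-finder on the gradient. The counter helper, the forward-difference Jacobian, and all tolerances are illustrative assumptions, not the exercise's prescribed interface:

```python
import numpy as np

def count_calls(fn):
    """Wrap fn so that fn.calls records the number of evaluations
    (a hypothetical bookkeeping helper for the comparison)."""
    def wrapped(x):
        wrapped.calls += 1
        return fn(x)
    wrapped.calls = 0
    return wrapped

def newton_root(F, x0, tol=1e-6, max_iter=100, h=1e-7):
    """Newton's method for F(x) = 0 with a forward-difference Jacobian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        J = np.empty((x.size, x.size))
        for j in range(x.size):          # build Jacobian column by column
            xh = x.copy()
            xh[j] += h
            J[:, j] = (F(xh) - Fx) / h
        x = x - np.linalg.solve(J, Fx)
    return x

# find the minimum of (x-1)^2 + 2(y-2)^2 by solving grad f = 0,
# while counting the gradient evaluations Newton's method needs
G = count_calls(lambda p: np.array([2*(p[0] - 1), 4*(p[1] - 2)]))
root = newton_root(G, [0.0, 0.0])
```

The same wrapper applied to the objective function of the minimizer gives the other side of the comparison.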
Make up some other interesting examples.
(3 points) Implement the downhill simplex method.
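A minimal sketch of the downhill simplex (Nelder-Mead) method with the standard reflection, expansion, contraction, and shrink moves; the initial step size and stopping criterion are arbitrary choices:

```python
import numpy as np

def downhill_simplex(f, x0, step=0.5, tol=1e-8, max_iter=5000):
    """Minimal Nelder-Mead downhill simplex (a sketch)."""
    n = len(x0)
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):                        # initial simplex: x0 + step*e_i
        v = np.asarray(x0, dtype=float)
        v[i] += step
        simplex.append(v)
    fvals = [f(v) for v in simplex]
    for _ in range(max_iter):
        order = np.argsort(fvals)             # sort vertices, best first
        simplex = [simplex[i] for i in order]
        fvals = [fvals[i] for i in order]
        if abs(fvals[-1] - fvals[0]) < tol:   # simplex has collapsed
            break
        centroid = np.mean(simplex[:-1], axis=0)  # centroid without worst
        xr = centroid + (centroid - simplex[-1])  # reflection
        fr = f(xr)
        if fr < fvals[0]:                         # best so far: try expansion
            xe = centroid + 2*(centroid - simplex[-1])
            fe = f(xe)
            simplex[-1], fvals[-1] = (xe, fe) if fe < fr else (xr, fr)
        elif fr < fvals[-2]:                      # plain reflection accepted
            simplex[-1], fvals[-1] = xr, fr
        else:                                     # contract toward centroid
            xc = centroid + 0.5*(simplex[-1] - centroid)
            fc = f(xc)
            if fc < fvals[-1]:
                simplex[-1], fvals[-1] = xc, fc
            else:                                 # shrink toward best vertex
                simplex = [simplex[0]] + [simplex[0] + 0.5*(v - simplex[0])
                                          for v in simplex[1:]]
                fvals = [fvals[0]] + [f(v) for v in simplex[1:]]
    return simplex[0]

# quick check on a paraboloid with minimum at (1, -2)
xmin = downhill_simplex(lambda p: (p[0] - 1)**2 + (p[1] + 2)**2, [0.0, 0.0])
```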
(1 point) Implement the downhill simplex method with simulated annealing.
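One way to add annealing, following the idea used in Numerical Recipes' amebsa routine: perturb every stored vertex value upward and every trial value downward by a positive, exponentially distributed fluctuation of scale T, so that at T > 0 the simplex occasionally accepts uphill moves; as T -> 0 the method reduces to the plain downhill simplex. A sketch of the fluctuation helper (the cooling schedule, e.g. T *= 0.9 between restarts, is an arbitrary choice):

```python
import math
import random

def thermal(fval, T, sign=+1):
    """Add (sign=+1) or subtract (sign=-1) a positive, exponentially
    distributed thermal fluctuation of scale T to a function value.
    At T = 0 this returns the plain value.  1 - random() lies in (0, 1],
    so the logarithm is always finite."""
    return fval + sign * T * (-math.log(1.0 - random.random()))
```

Inside the simplex loop one then compares thermal(stored_value, T, +1) against thermal(trial_value, T, -1) instead of the bare function values, repeating the minimization at progressively lower temperatures.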
(0 points)
Suppose you are already at the "end-game" stage, near a local minimum where the function is approximately parabolic. Try to improve the rate of convergence of the downhill simplex method.