Exercise "Minimization"

  1. (6 points)

    Implement Newton's minimization method with backtracking line-search, where the user provides the first and second derivatives (the gradient and the Hessian matrix) of the objective function.

    Find the minimum of Rosenbrock's valley function, f(x,y) = (1-x)² + 100(y-x²)².

    Find the minima of Himmelblau's function, f(x,y) = (x²+y-11)² + (x+y²-7)².

    Notice the number of steps it takes for the algorithm to reach the minimum.
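    A minimal sketch of such a routine, in Python with NumPy (the language choice and the function names `newton_minimize`, `grad`, `hess` are assumptions, not part of the exercise), applied to Rosenbrock's valley function:

```python
import numpy as np

def newton_minimize(f, gradient, hessian, x0, acc=1e-6, max_steps=1000):
    """Newton's method with backtracking line-search.
    The user supplies the gradient and Hessian of f."""
    x = np.asarray(x0, dtype=float)
    for step in range(max_steps):
        g = gradient(x)
        if np.linalg.norm(g) < acc:           # converged: gradient small enough
            return x, step
        dx = np.linalg.solve(hessian(x), -g)  # Newton step: solve H*dx = -grad f
        lam, fx = 1.0, f(x)
        while f(x + lam*dx) > fx and lam > 1/64:
            lam /= 2                          # backtrack until f decreases
        x = x + lam*dx
    return x, max_steps

# Rosenbrock's valley function, minimum at (1, 1)
def f(v):
    x, y = v
    return (1 - x)**2 + 100*(y - x**2)**2

def grad(v):
    x, y = v
    return np.array([-2*(1 - x) - 400*x*(y - x**2), 200*(y - x**2)])

def hess(v):
    x, y = v
    return np.array([[2 - 400*y + 1200*x**2, -400*x],
                     [-400*x, 200.0]])

xmin, steps = newton_minimize(f, grad, hess, [0.0, 0.0])
```

    The returned step count gives the number the exercise asks you to record; the same driver works for Himmelblau's function once its gradient and Hessian are supplied.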

  2. (3 points)

    Implement the quasi-Newton minimization method with the symmetric rank-1 (SR1) update. Start with the identity matrix as the initial inverse Hessian and then apply the updates.

    Find the minima of Rosenbrock's valley function and Himmelblau's function from part 1.

    Compare the effectiveness (say, the number of steps needed to reach the minimum) of the quasi-Newton method with that of Newton's method from part 1.

    Make some other interesting examples.
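    A possible sketch in Python with NumPy (again an assumed language; the name `qnewton_sr1` and the reset-on-failure heuristic are choices of this sketch, not prescribed by the exercise), updating the inverse Hessian approximation B with the SR1 formula B += uuᵀ/(u·y), where u = s - B·y, s is the step and y the change in gradient:

```python
import numpy as np

def qnewton_sr1(f, gradient, x0, acc=1e-6, max_steps=10000):
    """Quasi-Newton minimization with the symmetric rank-1 (SR1)
    update of the inverse Hessian approximation B."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    B = np.eye(n)                        # start with the identity inverse Hessian
    g = gradient(x)
    for step in range(max_steps):
        if np.linalg.norm(g) < acc:     # converged
            return x, step
        dx = -B @ g                      # quasi-Newton step
        lam, fx = 1.0, f(x)
        while f(x + lam*dx) > fx:        # backtracking line-search
            lam /= 2
            if lam < 1/1024:             # step unreliable: reset B
                B = np.eye(n)
                break
        s = lam*dx
        xn = x + s
        gn = gradient(xn)
        y = gn - g
        u = s - B @ y
        uy = u @ y
        if abs(uy) > 1e-12:              # SR1 update; skip if denominator is tiny
            B += np.outer(u, u) / uy
        x, g = xn, gn
    return x, max_steps

# Himmelblau's function and its gradient
def himmelblau(v):
    x, y = v
    return (x*x + y - 11)**2 + (x + y*y - 7)**2

def hgrad(v):
    x, y = v
    return np.array([4*x*(x*x + y - 11) + 2*(x + y*y - 7),
                     2*(x*x + y - 11) + 4*y*(x + y*y - 7)])

xmin, steps = qnewton_sr1(himmelblau, hgrad, [2.0, 2.0])  # near the minimum at (3, 2)
```

    Note that no Hessian is supplied here: the inverse Hessian is built up from gradient differences alone, which is the point of comparison with part 1.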

  3. (1 point)

    Implement the downhill simplex (Nelder-Mead) method.
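    A compact sketch in Python with NumPy (an assumed language; the simplified update rules below, with fixed reflection, expansion and contraction factors of 1, 2 and 1/2, are one common textbook variant):

```python
import numpy as np

def downhill_simplex(f, start, size=0.1, acc=1e-8, max_steps=100000):
    """Downhill simplex (Nelder-Mead): evolve a simplex of n+1 points
    by reflection, expansion, contraction and reduction."""
    start = np.asarray(start, dtype=float)
    n = len(start)
    p = [start] + [start + size*e for e in np.eye(n)]  # initial simplex
    for _ in range(max_steps):
        p.sort(key=f)                          # p[0] best, p[-1] worst
        if np.linalg.norm(p[-1] - p[0]) < acc: # simplex has shrunk enough
            break
        cen = np.mean(p[:-1], axis=0)          # centroid of all but the worst
        refl = cen + (cen - p[-1])             # reflect the worst point
        if f(refl) < f(p[0]):
            expa = cen + 2*(cen - p[-1])       # try expansion
            p[-1] = expa if f(expa) < f(refl) else refl
        elif f(refl) < f(p[-1]):
            p[-1] = refl
        else:
            contr = (cen + p[-1])/2            # contraction
            if f(contr) < f(p[-1]):
                p[-1] = contr
            else:                              # reduction toward the best point
                p = [p[0]] + [(q + p[0])/2 for q in p[1:]]
    return p[0]

# Rosenbrock's valley function, minimum at (1, 1)
def rosen(v):
    x, y = v
    return (1 - x)**2 + 100*(y - x**2)**2

xmin = downhill_simplex(rosen, [0.0, 0.0])
```

    Since the simplex method uses only function values, no gradient or Hessian is needed, which makes it a useful baseline against the methods from parts 1 and 2.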

  4. (0 points) Make up an interesting exercise yourself.