Exercise "Root Finding"
Implement Newton's method with a simple backtracking line-search, where the derivatives in the Jacobian matrix are evaluated numerically.
The interface to the function could be something like
void newton(void f(vector x, vector fx), vector xstart, double dx, double epsilon);
where the user supplies the following parameters:
f: takes the input vector x, calculates the vector f(x), and puts it into the vector fx;
vector xstart: the starting point; upon exit it holds the latest approximation to the root;
double dx: the δx to be used in the numerical evaluation of the Jacobian matrix;
double epsilon: the accuracy goal.
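As a sketch of the algorithm (in Python with NumPy rather than the C interface above, so plain arrays replace the vector type; the function names, default tolerances, and the acceptance condition ‖f(x+λΔx)‖ < (1−λ/2)‖f(x)‖ are assumptions, not part of the exercise text):

```python
import numpy as np

def newton(f, x, dx=1e-7, eps=1e-6):
    """Newton's method with a numerical Jacobian and simple backtracking.
    f: callable returning f(x) as an array; x: starting point;
    dx: finite-difference step for the Jacobian; eps: accuracy goal on |f(x)|."""
    x = np.array(x, dtype=float)
    n = len(x)
    while True:
        fx = f(x)
        if np.linalg.norm(fx) < eps:
            return x
        # forward-difference Jacobian: J[i,j] = (f_i(x + dx*e_j) - f_i(x)) / dx
        J = np.empty((n, n))
        for j in range(n):
            xj = x.copy()
            xj[j] += dx
            J[:, j] = (f(xj) - fx) / dx
        Dx = np.linalg.solve(J, -fx)   # full Newton step
        # simple backtracking: halve lambda until the norm decreases enough
        lam = 1.0
        while np.linalg.norm(f(x + lam*Dx)) > (1 - lam/2)*np.linalg.norm(fx) and lam > 1/64:
            lam /= 2
        x += lam*Dx

# usage: the system f(x,y) = (x^2 - 4, y - 1) has a root at (2, 1)
root = newton(lambda v: np.array([v[0]**2 - 4, v[1] - 1]), [1.0, 0.0])
```

The guard lam > 1/64 forces a (damped) step even when no halving satisfies the acceptance test, which keeps the iteration from stalling.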
Solve the system of equations
A x y = 1 ,
exp(-x) + exp(-y) = 1 + 1/A,
where A=10000.
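For this system one can also write a small stand-alone damped Newton iteration with the analytic Jacobian, as a cross-check; the starting guess (1/A, 1) and the iteration limits below are assumptions, and by the x↔y symmetry of the system the iteration may land on either of the two mirror roots:

```python
import numpy as np

A = 10000.0

def f(v):
    # residuals of the system: A*x*y = 1, exp(-x)+exp(-y) = 1 + 1/A
    x, y = v
    return np.array([A*x*y - 1, np.exp(-x) + np.exp(-y) - 1 - 1/A])

def jac(v):
    # analytic Jacobian of f
    x, y = v
    return np.array([[A*y, A*x],
                     [-np.exp(-x), -np.exp(-y)]])

v = np.array([1/A, 1.0])           # starting guess (assumed)
for _ in range(200):
    fv = f(v)
    if np.linalg.norm(fv) < 1e-10:
        break
    step = np.linalg.solve(jac(v), -fv)
    lam = 1.0                       # backtrack until the residual norm decreases
    while np.linalg.norm(f(v + lam*step)) > np.linalg.norm(fv) and lam > 1/1024:
        lam /= 2
    v = v + lam*step
```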
Find the minimum of Rosenbrock's valley function,
f(x,y) = (1-x)² + 100(y-x²)²,
by searching for the roots of its gradient.
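The gradient to use as the root-finding target works out to ∂f/∂x = -2(1-x) - 400x(y-x²) and ∂f/∂y = 200(y-x²); a quick sketch that cross-checks it against central differences (the test point p and step h are arbitrary choices):

```python
import numpy as np

def rosenbrock(v):
    x, y = v
    return (1 - x)**2 + 100*(y - x**2)**2

def grad(v):
    # analytic gradient; its root locates the minimum at (1, 1)
    x, y = v
    return np.array([-2*(1 - x) - 400*x*(y - x**2),
                     200*(y - x**2)])

# cross-check against central differences at an arbitrary point
p, h = np.array([0.3, -0.7]), 1e-6
num = np.array([(rosenbrock(p + h*e) - rosenbrock(p - h*e))/(2*h)
                for e in np.eye(2)])
```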
Find the minimum of Himmelblau's function,
f(x,y) = (x²+y-11)² + (x+y²-7)²,
by searching for the roots of its gradient.
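Here the gradient is ∂f/∂x = 4x(x²+y-11) + 2(x+y²-7) and ∂f/∂y = 2(x²+y-11) + 4y(x+y²-7); note that Himmelblau's function has four minima, so which one the root search finds depends on the starting point. A minimal sketch (the check point (3,2) is one of the known minima):

```python
import numpy as np

def himmelblau_grad(v):
    # gradient of (x^2+y-11)^2 + (x+y^2-7)^2
    x, y = v
    return np.array([4*x*(x**2 + y - 11) + 2*(x + y**2 - 7),
                     2*(x**2 + y - 11) + 4*y*(x + y**2 - 7)])

g = himmelblau_grad([3.0, 2.0])   # (3, 2): both inner factors vanish exactly
```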
Make some other interesting examples.
(3 points)
Implement a version of Newton's method with back-tracking where the derivatives (the Jacobian matrix) are supplied analytically by the user.
Investigate the effectiveness of this implementation (as compared with the previous one) on some interesting examples by counting the number of function calls.
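One simple way to count the calls without touching the solver is to wrap the user function in a counting closure; a sketch (the attribute name calls is an arbitrary choice):

```python
def counted(f):
    """Wrap f so that every call is counted; read the total from wrapper.calls."""
    def wrapper(*args):
        wrapper.calls += 1
        return f(*args)
    wrapper.calls = 0
    return wrapper

# usage: pass the wrapped function to the solver, then inspect sq.calls
sq = counted(lambda x: x*x)
sq(2); sq(3); sq(4)
```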
(1 point)
Implement a more refined line-search with, say, quadratic interpolation.
Compare the number of function evaluations using the refined line-search and the simple backtracking.
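One common refinement: when a trial step λ is rejected, fit a quadratic q(l) = φ(0) + φ'(0)l + cl² through the known values φ(0), φ'(0), and φ(λ), and jump to its minimizer instead of blindly halving. For the Newton direction with φ(λ) = ½‖f(x+λΔx)‖² one has φ'(0) = -‖f(x)‖². A sketch (the clamping interval [λ/10, λ/2] is a customary safeguard, not prescribed by the exercise):

```python
def refined_lambda(phi0, dphi0, lam, phi_lam):
    """One quadratic-interpolation refinement of the line-search parameter.
    phi0 = phi(0), dphi0 = phi'(0) (< 0 in a descent direction),
    lam = the rejected trial step, phi_lam = phi(lam)."""
    c = (phi_lam - phi0 - dphi0*lam) / lam**2     # curvature of the fitted quadratic
    lam_min = -dphi0 / (2*c) if c > 0 else lam/2  # fall back to halving if no minimum
    return min(max(lam_min, lam/10), lam/2)       # keep the new step inside [lam/10, lam/2]
```

Counting function evaluations for both strategies then shows whether the extra arithmetic pays off in fewer calls to f.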
(0 points)
Make up some interesting root-finding exercise.