Some Nonlinear Optimization Methods
by Jos van Trier
Although most nonlinear optimization methods use linear approximations to find
search directions in the model space, the search for the minimum itself
is nonlinear: the nonlinear objective function is re-evaluated at
each step of the search.
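This iteration can be sketched with simple gradient descent. The objective below is an illustrative choice, not one from the report; the fixed step length stands in for the line search a practical method would perform:

```python
import math

def f(x):
    # An illustrative nonlinear objective (not from the report).
    return x * x + math.exp(x)

def dfdx(x):
    # Its derivative, used to build the local linear model.
    return 2.0 * x + math.exp(x)

x = 0.0
for _ in range(200):
    direction = -dfdx(x)   # search direction from a local linear model
    x += 0.1 * direction   # fixed step; real methods do a line search
    current = f(x)         # the nonlinear objective is re-evaluated
                           # at each new point

print(x, current)
```

The linear model supplies only the direction; progress is always judged by recomputing the full nonlinear objective at the new point.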
These nonlinear methods can be divided into three classes: methods that use
only function evaluations, methods that also use first derivatives, and
methods that require function, gradient, and second-derivative information.
I tested several methods on Rosenbrock's function, a test function
often used in the optimization literature.
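Rosenbrock's function is conventionally written f(x, y) = (1 - x)^2 + 100 (y - x^2)^2, with its minimum of 0 at (1, 1). The report does not say which implementations were used; as a sketch of the three classes, the example below runs one representative of each through SciPy (an assumed dependency, not part of the original report): Nelder-Mead (function values only), BFGS (function plus gradient), and Newton-CG (function, gradient, and Hessian):

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Classic two-dimensional Rosenbrock function:
    # f(x, y) = (1 - x)^2 + 100 (y - x^2)^2, minimum 0 at (1, 1).
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

def grad(x):
    # Analytic gradient, needed by the first-derivative methods.
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])

def hess(x):
    # Analytic Hessian, needed by the second-derivative methods.
    return np.array([
        [2.0 - 400.0 * (x[1] - 3.0 * x[0]**2), -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

x0 = np.array([-1.2, 1.0])  # conventional starting point

# One representative method from each class:
results = {
    "Nelder-Mead (function values only)":
        minimize(rosenbrock, x0, method="Nelder-Mead"),
    "BFGS (function + gradient)":
        minimize(rosenbrock, x0, method="BFGS", jac=grad),
    "Newton-CG (function + gradient + Hessian)":
        minimize(rosenbrock, x0, method="Newton-CG", jac=grad, hess=hess),
}
for name, res in results.items():
    print(f"{name}: x = {res.x}, f = {res.fun:.2e}")
```

The function's curved, narrow valley is what makes it a useful stress test: methods that rely on local linear models must take many small steps to follow it.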