Some Nonlinear Optimization Methods, by Jos van Trier

Although most nonlinear optimization methods use linear approximations to find search directions in the model space, their search for the minimum is nonlinear: the nonlinear objective function is recalculated at each step in the search. These nonlinear methods can be divided into three classes: methods that use only function evaluations, methods that also use first derivatives (gradients), and methods that require function, gradient, and second-derivative information. I tested several methods on Rosenbrock's function, a test function often used in the optimization literature.
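The following is a minimal sketch, not the report's own implementation, of the three classes on Rosenbrock's function, using SciPy routines as stand-ins: Nelder-Mead (function evaluations only), conjugate gradients (function values plus gradients), and Newton-CG (function, gradient, and Hessian). The starting point (-1.2, 1) is the one conventionally used for this test.

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    """Rosenbrock's test function; its minimum f = 0 lies at (1, 1)."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    """Analytic gradient of Rosenbrock's function."""
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def rosenbrock_hess(x):
    """Analytic Hessian (second derivatives) of Rosenbrock's function."""
    return np.array([
        [1200.0 * x[0]**2 - 400.0 * x[1] + 2.0, -400.0 * x[0]],
        [-400.0 * x[0], 200.0],
    ])

x0 = np.array([-1.2, 1.0])  # conventional starting point for this test

# Class 1: function evaluations only (Nelder-Mead simplex).
res_nm = minimize(rosenbrock, x0, method="Nelder-Mead")

# Class 2: function values plus gradients (nonlinear conjugate gradients).
res_cg = minimize(rosenbrock, x0, method="CG", jac=rosenbrock_grad)

# Class 3: function, gradient, and second-derivative information (Newton-CG).
res_nt = minimize(rosenbrock, x0, method="Newton-CG",
                  jac=rosenbrock_grad, hess=rosenbrock_hess)

for name, res in [("Nelder-Mead", res_nm), ("CG", res_cg), ("Newton-CG", res_nt)]:
    print(f"{name:12s} x = {res.x}, f = {res.fun:.3e}, nfev = {res.nfev}")
```

Comparing the reported number of function evaluations (`nfev`) for the three runs illustrates the usual trade-off: methods that exploit derivative information typically converge in fewer function evaluations, at the cost of supplying gradient and Hessian code.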

