Next: The L-BFGS-B algorithm
Up: Optimization
Previous: Gradient Projection Algorithm
Trust region methods are widely used for solving problems where
the Hessian has regions of negative curvature (Kelley, 1999).
The basic idea behind these methods is to fit the objective function
locally with a quadratic model near a point ${\bf x}_k$:
\begin{displaymath}
m({\bf x}) = f({\bf x}_k) + {\bf g}_k^T({\bf x}-{\bf x}_k) + \frac{1}{2}({\bf x}-{\bf x}_k)^T {\bf B}_k ({\bf x}-{\bf x}_k),
\end{displaymath}
(5)
where $^T$ denotes the transpose, ${\bf g}_k$ is the gradient, and ${\bf B}_k$ is the Hessian
of $f$ at iteration $k$. The goal is to find an ${\bf x}$
that minimizes $m({\bf x})$ such that
\begin{displaymath}
\Vert{\bf x}-{\bf x}_k\Vert \leq \Delta,
\end{displaymath}
(6)
where $\Delta$ is the trust region radius. Once a local minimizer is
found, either the step is accepted, or the radius $\Delta$ is changed,
or both. A termination criterion ends the process. In the simplest
case, an update of ${\bf x}_k$ is obtained by minimizing the quadratic
model along the steepest-descent direction,
\begin{displaymath}
\Phi(\lambda) = m({\bf x}_k - \lambda {\bf g}_k),
\end{displaymath}
(7)
with $\lambda \geq 0$ chosen so that ${\bf x}_k - \lambda {\bf g}_k$
stays within the trust region. This update is also called the Cauchy point.
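As a concrete illustration, the Cauchy point has a closed form: minimizing $\Phi(\lambda)$ subject to the radius bound gives $\lambda = {\bf g}_k^T{\bf g}_k / {\bf g}_k^T{\bf B}_k{\bf g}_k$ when the curvature along ${\bf g}_k$ is positive, capped at $\Delta/\Vert{\bf g}_k\Vert$; with nonpositive curvature the step runs to the boundary. A minimal sketch in plain Python (the function name and list-based vectors are my own, not from this paper):

```python
def cauchy_point(g, B, delta):
    """Step p = -lam * g minimizing the quadratic model along -g,
    subject to the trust-region bound ||p|| <= delta (Euclidean norm)."""
    n = len(g)
    g_dot_g = sum(gi * gi for gi in g)                       # g^T g
    g_B_g = sum(g[i] * sum(B[i][j] * g[j] for j in range(n))
                for i in range(n))                           # g^T B g
    lam_max = delta / g_dot_g ** 0.5      # value of lam at the boundary
    if g_B_g > 0:
        lam = min(g_dot_g / g_B_g, lam_max)   # interior minimizer, capped
    else:
        lam = lam_max                         # negative curvature: go to boundary
    return [-lam * gi for gi in g]
```

For example, with ${\bf g}_k = (1, 0)$, ${\bf B}_k = 2I$, and a large radius, the unconstrained minimizer $\lambda = 1/2$ is feasible, giving the step $(-0.5, 0)$.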
This technique closely resembles the gradient projection
algorithm used to find the active set. It is therefore not surprising
that many algorithms for bound-constrained optimization are
either completely or partially based on trust region methods.
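The accept-or-adjust decision described above is conventionally driven by the ratio of the actual reduction in $f$ to the reduction predicted by the model $m$. The sketch below uses common textbook threshold values (0.25, 0.75) and scaling factors, which are illustrative assumptions, not values taken from this paper:

```python
def update_radius(actual_red, predicted_red, step_norm, delta):
    """Decide whether to accept a trial step and how to change the radius.
    Thresholds (0.25, 0.75) and factors (0.25, 2.0) are illustrative."""
    rho = actual_red / predicted_red       # model quality: ~1 means a good fit
    if rho < 0.25:
        return False, 0.25 * delta         # poor fit: reject step, shrink region
    if rho > 0.75 and step_norm >= delta:
        return True, 2.0 * delta           # good fit at the boundary: grow region
    return True, delta                     # acceptable fit: keep the radius
```

A step is rejected only when the model badly overpredicts the reduction; the radius grows only when the model fits well and the step was limited by the boundary.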
The algorithm used in this paper does not involve any trust region radii,
relying on line search algorithms instead. Nevertheless, it retains the idea of
fitting the objective function locally with the quadratic model of
equation (5) to find the minimum of $f$ in
equation (1).
Stanford Exploration Project
10/23/2004