
Trust region methods

Trust region methods are widely used for solving problems where the Hessian has regions of negative curvature (Kelley, 1999). The basic idea behind these methods is to fit the objective function locally with a quadratic model near a point ${\bf x_k}$:
\begin{displaymath}
m({\bf x}) = f({\bf x_k}) + {\bf g_k^T (x-x_k)} + {\bf (x-x_k)^T B_k (x-x_k)}/2,
\end{displaymath} (5)
where ${(\cdot)^T}$ denotes the transpose, ${\bf g_k}$ is the gradient, and ${\bf B_k}$ is the Hessian of $f({\bf x})$ at iteration $k$. The goal is to find an ${\bf x}$ that minimizes $m({\bf x})$ such that
\begin{displaymath}
\Vert{\bf x-x_k}\Vert\leq \Delta,\end{displaymath} (6)
where $\Delta$ is the trust region radius. Once a local minimizer is found, either the step is accepted, the radius $\Delta$ is changed, or both. A termination criterion ends the process. In the simplest case, an update of ${\bf x_k}$ is obtained by minimizing the quadratic model along the steepest-descent direction,
\begin{displaymath}
\Phi(\lambda)=m({\bf x_k} - \lambda {\bf g_k}),
\end{displaymath} (7)
with
\begin{displaymath}
\Vert\lambda {\bf g_k}\Vert\leq \Delta.
\end{displaymath}
This update is also called the Cauchy point. This technique closely resembles the gradient projection algorithm used to find the active set $A({\bf x})$. It is therefore not surprising that many algorithms for bound constrained optimization are either completely or partially based on trust region methods.
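
As a concrete illustration, the following Python sketch computes the Cauchy point of equation (7) in its standard closed form and applies a conventional accept/adjust rule for the radius $\Delta$. The function names, the thresholds (0.25, 0.75, eta), and the fallback choices are illustrative assumptions, not part of any specific algorithm discussed in this paper.

\begin{verbatim}
import numpy as np

def cauchy_point(g, B, delta):
    """Minimizer of the quadratic model (5) along the steepest-descent
    direction -g, restricted to the trust region of radius delta (eq. 6-7)."""
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return np.zeros_like(g)           # already at a stationary point
    gBg = g @ B @ g
    if gBg <= 0.0:
        tau = 1.0                         # negative curvature: step to the boundary
    else:
        tau = min(gnorm**3 / (delta * gBg), 1.0)
    return -tau * (delta / gnorm) * g     # p = -lambda*g with lambda = tau*delta/||g||

def trust_region_update(f, x, g, B, p, delta, delta_max=10.0, eta=0.1):
    """Accept or reject the step p and adjust the radius delta, following the
    'accept the step, change the radius, or both' rule described above."""
    predicted = -(g @ p + 0.5 * p @ B @ p)        # m(x_k) - m(x_k + p)
    rho = (f(x) - f(x + p)) / predicted           # actual vs. predicted reduction
    if rho < 0.25:
        delta *= 0.25                             # poor model fit: shrink the region
    elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
        delta = min(2.0 * delta, delta_max)       # good fit at the boundary: expand
    x_new = x + p if rho > eta else x             # keep x_k unless decrease is sufficient
    return x_new, delta
\end{verbatim}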

The algorithm used in this paper does not involve a trust region radius; it relies on line searches instead. Nevertheless, it retains the idea of fitting the objective function locally with the quadratic model of equation (5) to find the minimum of $f({\bf x})$ in equation (1).
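
To make the distinction concrete, the sketch below replaces the trust region radius with a simple backtracking (Armijo) line search along the minimizer of the quadratic model (5). It is only a generic illustration of this idea, not the algorithm of the next section; the Armijo constant c1 and the backtracking factor are assumed values.

\begin{verbatim}
import numpy as np

def model_line_search_step(f, x, g, B, c1=1e-4, shrink=0.5, max_backtracks=30):
    """Quadratic-model step combined with a backtracking (Armijo) line search
    in place of a trust-region radius; a generic sketch only."""
    try:
        d = np.linalg.solve(B, -g)        # unconstrained minimizer of the model (5)
    except np.linalg.LinAlgError:
        d = -g                            # singular B: fall back to steepest descent
    if g @ d >= 0.0:
        d = -g                            # not a descent direction: fall back as well
    fx, slope = f(x), g @ d
    step = 1.0
    for _ in range(max_backtracks):
        if f(x + step * d) <= fx + c1 * step * slope:   # sufficient decrease test
            return x + step * d
        step *= shrink
    return x                              # no acceptable step found
\end{verbatim}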

