
If we apply the same weight we used in IRLS,
but leave the operator itself unchanged, the weight affects only the gradient direction.
This corresponds to guiding the gradient direction with a weighted residual,
and the resulting guided gradient is the same gradient
used in the IRLS method.
This algorithm can be implemented, in outline, as follows:
`
iterate {
    rw <- W r            # weight the residual
    g  <- L' rw          # compute the gradient from the weighted residual
    s  <- g + beta s     # update the conjugate direction
    x  <- x + alpha s    # update the model
    r  <- r + alpha L s  # update the residual
}
`
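As a concrete sketch, the guided loop can be written in Python as below. The operator name `L`, the IRLS-type weight `|r|^(p-2)`, and the damping `eps` are assumptions for illustration; the step length comes from an exact LS line search along the guided direction, so each update is an LS solution along the weighted-gradient direction:

```python
import numpy as np

def cgg(L, d, p=2.0, niter=50, eps=1e-8):
    """Conjugate Guided Gradient sketch: a CG-style loop in which the
    residual is reweighted before the gradient is computed from it.
    The |r|^(p-2) weight is an assumed IRLS-type choice."""
    m = np.zeros(L.shape[1])
    r = L @ m - d                # residual r = Lm - d
    s = np.zeros_like(m)         # conjugate search direction
    g_dot_old = None
    for _ in range(niter):
        w = (np.abs(r) + eps) ** (p - 2)  # weight from the current residual
        g = L.T @ (w * r)                 # guided gradient: weight the residual first
        g_dot = g @ g
        beta = 0.0 if g_dot_old is None else g_dot / g_dot_old
        s = g + beta * s                  # conjugate direction update
        Ls = L @ s
        denom = Ls @ Ls
        if denom == 0.0:                  # direction vanished; converged
            break
        alpha = -(r @ Ls) / denom         # exact LS line search along s
        m += alpha * s
        r += alpha * Ls
        g_dot_old = g_dot
    return m
```

For `p = 2` the weight is unity and the loop reduces to conventional conjugate gradients on the least-squares problem; for `p < 2` large residuals are downweighted before they enter the gradient.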

Notice that the above algorithm differs from the original
CG algorithm only in the gradient-computation step:
the gradient is modified by weighting the residual
before the gradient is computed from it.
By choosing the weight as a function of the residual
of the previous iteration step,
as we did in the IRLS, we can guide the gradient
to the gradient of the *L*^{p}-norm.
Thus the result obtained by weighting the residual
could be interpreted as an LS solution located along the direction
of the *L*^{p}-norm gradient, according to the weight applied.
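As a concrete illustration, the standard IRLS-type guiding weight for an *L*^{p} norm can be written as below; the damping term `eps`, added to avoid division by zero when residuals vanish, is an assumption:

```python
import numpy as np

def lp_weight(r, p, eps=1e-8):
    # Multiplying the residual by |r|^(p-2) turns the L2 gradient
    # L'(w * r) into a scaled gradient of the L^p misfit.
    # For p = 2 the weight is unity; for p = 1 it is ~1/|r|,
    # downweighting large residuals (outliers).
    return (np.abs(r) + eps) ** (p - 2)
```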
If, during the iterations, an intermediate solution
happens to reach the minimum *L*^{2}-norm location in the model space,
it becomes the final solution of the algorithm,
identical to the solution of the conventional LS problem.
However, the minimum *L*^{2}-norm location is unlikely to fall
along the gradient of the different *L*^{p}-norm determined
by the applied weight.
Therefore, it is more likely that the solution will be close to
the minimum *L*^{p}-norm location determined by the applied weight.

Stanford Exploration Project

5/23/2004