CGG with iteratively reweighted residual and gradient

In the previous two subsections, we examined the meaning of weighting the residual vector and the gradient vector, respectively. Since applying a weight in either residual space or model space merely changes the descent direction of the solution search, the weighting need not be limited to one space or the other. We can weight both the residual and the gradient,


	$\bold r \quad\longleftarrow\quad \bold L \bold m - \bold d$
	iterate {
		$\bold W_r \quad\longleftarrow\quad {\bf diag}[f(\bold r)]$
		$\bold W_m \quad\longleftarrow\quad {\bf diag}[f(\bold m)]$
		$\Delta\bold m \quad\longleftarrow\quad \bold W_m \bold L^T \bold W_r \bold r$
		$\Delta\bold r \quad\longleftarrow\quad \bold L \, \Delta\bold m$
		$(\bold m, \bold r) \quad\longleftarrow\quad {\rm cgstep}(\bold m, \bold r, \Delta\bold m, \Delta\bold r)$
	} .

Again, the above CGG algorithm differs from the conventional CG method only in the gradient-computation step. Whether we modify the gradient in the residual sense or in the model sense, the change affects only the gradient direction, that is, the direction in which the solution is sought. Therefore the CGG algorithm always converges to a solution.
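The doubly weighted loop above can be sketched in NumPy as follows. This is a minimal illustration, not the SEP implementation: the weighting functions `f_r` and `f_m` are user-supplied placeholders for $f(\cdot)$, and an exact line search along the guided direction stands in for the full `cgstep` routine, which would also combine the previous conjugate direction.

```python
import numpy as np

def cgg(L, d, f_r, f_m, niter=50, eps=1e-12):
    """Sketch of the CGG loop with residual and model reweighting.

    L   : (n, p) forward-operator matrix
    d   : (n,) data vector
    f_r : elementwise weighting function for the residual, diag[f(r)]
    f_m : elementwise weighting function for the model, diag[f(m)]
    """
    n, p = L.shape
    m = np.zeros(p)
    r = L @ m - d                    # r <- L m - d
    for _ in range(niter):
        Wr = f_r(r)                  # diagonal of W_r, stored as a vector
        Wm = f_m(m)                  # diagonal of W_m
        dm = Wm * (L.T @ (Wr * r))   # guided gradient: W_m L^T W_r r
        dr = L @ dm                  # Delta r = L Delta m
        denom = dr @ dr
        if denom < eps:              # direction vanished; stop
            break
        alpha = -(r @ dr) / denom    # line search in place of cgstep
        m += alpha * dm
        r += alpha * dr
    return m

# Example: L1-style residual reweighting 1/|r|, unit model weights
rng = np.random.default_rng(0)
L = rng.standard_normal((100, 10))
m_true = rng.standard_normal(10)
d = L @ m_true
f_r = lambda r: 1.0 / np.maximum(np.abs(r), 1e-8)  # guards division by zero
f_m = lambda m: np.ones_like(m)
m_est = cgg(L, d, f_r, f_m, niter=200)
```

With both weights set to one, the loop reduces to ordinary steepest descent for least squares; the choice of `f_r` and `f_m` only redirects the descent, which is why convergence is retained.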


Stanford Exploration Project
5/23/2004