Least-squares imaging and deconvolution using the hybrid norm conjugate-direction solver

The data-space least-squares inversion minimizes the misfit between the recorded data **d** and the modeled data **L m**:

min_**m** ‖ **d** − **L m** ‖₂² ,  (1)

where **L** is the linearized modeling operator and **m** is the model. The solution of (1) is

**m̂** = ( **L**ᵀ **L** )⁻¹ **L**ᵀ **d** = **H**⁻¹ **L**ᵀ **d** = **L**† **d** ,  (2)

where **H** = **L**ᵀ **L** is called the Hessian operator, and **L**† = **H**⁻¹ **L**ᵀ is the pseudo-inverse of **L**.
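As a quick numerical sanity check, the relation between the Hessian and the pseudo-inverse can be verified with a toy NumPy sketch; the random matrix `L` below merely stands in for the modeling operator and is not the seismic implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the linearized modeling operator L (full column rank).
L = rng.standard_normal((50, 10))
m_true = rng.standard_normal(10)   # "reflectivity" model
d = L @ m_true                     # noise-free data d = L m

# Hessian operator H = L^T L, and the pseudo-inverse solution
# m_hat = H^{-1} L^T d = L† d.
H = L.T @ L
m_hat = np.linalg.solve(H, L.T @ d)

# With consistent, noise-free data the pseudo-inverse recovers the model.
print(np.allclose(m_hat, m_true))  # True
```

For noise-free, consistent data the pseudo-inverse recovers the model exactly; with noise or an ill-conditioned **H**, regularization (discussed below) becomes necessary.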

One disadvantage of this data-space inversion scheme is that it cannot be computed in a target-oriented way, since in theory even a local perturbation in the model space affects the entire data space, and vice versa. To overcome this difficulty, Valenciano (2006) transformed (1) into a model-space inversion based on (2):

**H m** = **L**ᵀ **d** .  (3)
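Because **H** = **L**ᵀ**L** is symmetric positive (semi-)definite, the model-space system can be solved iteratively. The sketch below runs a plain conjugate-gradient loop on a toy random operator; all names, sizes, and tolerances are illustrative assumptions, not the paper's solver:

```python
import numpy as np

def conj_grad(apply_H, b, niter=50, tol=1e-10):
    """Plain conjugate gradients for the SPD system H m = b."""
    m = np.zeros_like(b)
    r = b - apply_H(m)           # initial residual
    p = r.copy()                 # initial search direction
    rr = r @ r
    for _ in range(niter):
        Hp = apply_H(p)
        alpha = rr / (p @ Hp)    # step length along p
        m += alpha * p
        r -= alpha * Hp
        rr_new = r @ r
        if rr_new < tol:
            break
        p = r + (rr_new / rr) * p  # conjugate update of direction
        rr = rr_new
    return m

rng = np.random.default_rng(1)
L = rng.standard_normal((60, 12))        # toy modeling operator
m_true = rng.standard_normal(12)
m_mig = L.T @ (L @ m_true)               # migrated image m_mig = L^T d

# Solve H m = m_mig without ever touching the data space.
m_hat = conj_grad(lambda m: L.T @ (L @ m), m_mig)
print(np.allclose(m_hat, m_true, atol=1e-5))  # True
```

Note that the solver only needs products with **H**, which is what makes the sparse, target-oriented approximations below attractive.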

Valenciano (2008) and Tang (2008) showed that, unlike **L**, the matrix **H** is usually very sparse (i.e., most of its non-zero elements
are centered around the diagonal); thus, despite the huge size of
**H**, it is feasible to store an approximation of the **H** matrix by keeping only a
few off-diagonal elements without losing much accuracy.
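The storage idea can be illustrated with a toy convolutional operator, whose Hessian really is concentrated near the diagonal; keeping only a narrow band of off-diagonals introduces a modest truncation error. The filter and bandwidth below are arbitrary choices for illustration:

```python
import numpy as np

n = 80

# A convolution-like modeling operator: its Hessian H = L^T L is a
# banded autocorrelation matrix concentrated near the diagonal.
filt = np.array([0.2, 0.6, 1.0, 0.6, 0.2])
L = np.zeros((n, n))
for i in range(n):
    for k, f in enumerate(filt):
        j = i + k - 2
        if 0 <= j < n:
            L[i, j] = f
H = L.T @ L

# Keep only a few off-diagonals of H (a banded approximation).
half_band = 2
offsets = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
H_band = np.where(offsets <= half_band, H, 0.0)

# Relative Frobenius error of the banded approximation.
rel_err = np.linalg.norm(H - H_band) / np.linalg.norm(H)
print(rel_err)  # small relative to 1 for this toy filter
```

In the seismic setting the band is chosen per image point, so the stored approximation of **H** remains target-oriented.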

If we write **m**_mig = **L**ᵀ **d** for the migrated image, and add a model regularization term (since **H** most likely has a null space), the inversion formula becomes

**m̂** = argmin_**m** ‖ **H m** − **m**_mig ‖₂² + ε² ‖ **m** ‖_hyb ,  (4)

in which we applied the hybrid norm to the regularization term.
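One common way to handle a hybrid-norm (ℓ2 near zero, ℓ1-like in the tails; cf. Bube and Langan, 1997) penalty is iteratively reweighted least squares. The sketch below is a generic IRLS loop on a toy problem, not the paper's conjugate-direction solver; all parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 30

# Toy symmetric positive-definite "Hessian" H and a sparse model.
H = np.eye(n) + 0.02 * rng.standard_normal((n, n))
H = H.T @ H
m_true = np.zeros(n)
m_true[[4, 12, 25]] = [2.0, -1.5, 1.0]   # sparse reflectivity spikes
m_mig = H @ m_true                       # consistent migrated image

lam, eps = 0.1, 0.1
m = np.zeros(n)
for _ in range(20):
    # Hybrid-norm IRLS weights: quadratic penalty for |m| << eps,
    # ell-1-like (weight ~ eps/|m|) for |m| >> eps.
    w = 1.0 / np.sqrt(1.0 + (m / eps) ** 2)
    m = np.linalg.solve(H.T @ H + lam * np.diag(w), H.T @ m_mig)

rel_err = np.linalg.norm(m - m_true) / np.linalg.norm(m_true)
print(rel_err)  # small for this toy problem
```

Because the weights shrink toward zero on large model values, the hybrid regularizer penalizes strong, isolated reflectivity spikes far less than a plain ℓ2 damping would.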

Tang (2009) provided a way to compute the Hessian matrix efficiently using the phase-encoding technique; the Hessian is computed only once and stored for reuse in all iterations.
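The statistical mechanism behind phase encoding can be caricatured with random-phase probing: for unit-modulus random phase vectors **z**, E[**z z**ᴴ] = **I**, so correlating **H z** with **z** and averaging over encodings cancels the cross-talk terms and recovers **H**. The sketch below shows only this averaging idea on a toy dense matrix, not Tang's algorithm:

```python
import numpy as np

rng = np.random.default_rng(4)
n, n_enc = 20, 2000

L = rng.standard_normal((40, n))  # toy modeling operator
H = L.T @ L                       # exact Hessian, for comparison only

# Accumulate real( (H z) z^H ) over many random phase encodings;
# E[z z^H] = I, so the average converges to H.
H_est = np.zeros((n, n))
for _ in range(n_enc):
    z = np.exp(2j * np.pi * rng.random(n))  # unit-modulus random phases
    Hz = H @ z
    H_est += np.real(np.outer(Hz, np.conj(z)))
H_est /= n_enc

rel_err = np.linalg.norm(H_est - H) / np.linalg.norm(H)
print(rel_err)  # shrinks roughly like 1/sqrt(n_enc)
```

In practice the payoff is that one encoded modeling/migration pass probes many sources at once, so a useful (e.g., banded) approximation of **H** is obtained at a fraction of the cost of building it column by column.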


2010-05-19