
Combined Data and Model Residuals

To solve equation (3) for the optimal set of $\bold m_{i,k}$, we minimize a quadratic objective function, $Q(\bold m)$, defined as the sum of the weighted $\ell_2$ norms of a data residual [equation (4)] and of three model residuals [equations (5), (6), and (7)]:
\begin{displaymath}
\mbox{min} \; Q(\bold m) \; = \; \Vert \bold r_d \Vert^2 \; + \; \epsilon_1^2 \Vert \bold r_m^{[1]} \Vert^2 \; + \; \epsilon_2^2 \Vert \bold r_m^{[2]} \Vert^2 \; + \; \epsilon_3^2 \Vert \bold r_m^{[3]} \Vert^2
\end{displaymath} (8)
$\epsilon_1, \epsilon_2,$ and $\epsilon_3$ are scalars that balance the relative weights of the three model residuals against the data residual. In practice I suggest setting $\epsilon_1=2.0$, $\epsilon_2=1.0$, and $\epsilon_3=2.0$. I use the conjugate gradient method to minimize $Q(\bold m)$; the method is well suited to large-scale least-squares optimization problems like this one.
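Minimizing an objective of the form (8) is equivalent to solving, in the least-squares sense, the augmented system obtained by stacking the data-fitting operator on top of the scaled regularization operators. The sketch below illustrates this with a conjugate-gradient least-squares (CGLS) iteration on dense NumPy matrices; the operator and regularization matrices, the function name, and the iteration count are illustrative assumptions, not the actual operators of this paper.

```python
import numpy as np

def cg_quadratic(A, d, regs, eps, n_iter=100):
    """Minimize ||A m - d||^2 + sum_i eps_i^2 ||R_i m||^2 by running
    CGLS on the stacked system [A; eps_1 R_1; ...] m = [d; 0; ...].
    (Illustrative sketch: A, regs, eps are hypothetical dense operators.)"""
    # Stack the modeling operator and the scaled regularization operators.
    G = np.vstack([A] + [e * R for e, R in zip(eps, regs)])
    rhs = np.concatenate([d] + [np.zeros(R.shape[0]) for R in regs])
    m = np.zeros(A.shape[1])
    r = rhs - G @ m          # residual of the stacked system
    s = G.T @ r              # gradient of the quadratic objective
    p = s.copy()             # initial search direction
    gamma = s @ s
    for _ in range(n_iter):
        q = G @ p
        alpha = gamma / (q @ q)   # exact line search along p
        m += alpha * p
        r -= alpha * q
        s = G.T @ r
        gamma_new = s @ s
        p = s + (gamma_new / gamma) * p  # conjugate direction update
        gamma = gamma_new
    return m
```

In exact arithmetic CGLS reaches the minimizer in at most as many iterations as there are model parameters, which is why the method scales well to the large systems considered here.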
Stanford Exploration Project
7/8/2003