
Model Regularization 3: Crosstalk-Boosting Weighting

NMO (for primaries or multiples) flattens events of a single order, but leaves events of other orders (crosstalk) with residual curvature. To compute (for instance) the crosstalk model for first-order pegleg multiples on the primary model panel, $\bold m_0$, the following steps are followed.

Figure 2 illustrates the crosstalk weights for first- and second-order peglegs on $\bold m_0$, applied to an NMOed synthetic CMP gather. Notice how the multiples are ``picked'' cleanly out of the data, while strong primaries are left largely intact.

Figure 2: Synthetic CMP gather with and without crosstalk weights for $k=0$ and $j=1,2,3$ applied.
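To make the operation shown in the figure concrete, the fragment below is a minimal sketch (not the code used to produce Figure 2) of applying a precomputed crosstalk weight panel to an NMOed gather by element-wise multiplication; the array shapes and the variable names `gather` and `weight` are placeholders assumed for illustration, and the random inputs stand in for the real synthetic gather and weight.

\begin{verbatim}
import numpy as np

# Assumed panel size: ntau time samples by nx offsets for one NMOed CMP gather.
ntau, nx = 1000, 60

# Placeholder inputs; in practice `gather` is the NMOed synthetic gather and
# `weight` comes from the crosstalk-weight construction described above.
rng = np.random.default_rng(0)
gather = rng.standard_normal((ntau, nx))   # NMOed synthetic CMP gather
weight = rng.random((ntau, nx))            # crosstalk weight w(tau, x)

# Element-wise weighting "picks" the events the weight emphasizes, while
# samples where the weight is near zero (flat primaries) are attenuated.
picked = weight * gather
\end{verbatim}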
Denoting the crosstalk weights for each $\bold m_{i,k}$ as a vector $\bold w_{i,k}$, we can write the model residual corresponding to the third model regularization operator:  
\begin{displaymath}
  \bold r_m^{[3]}(\tau,x,i,k) = w_{i,k}(\tau,x) \; m_{i,k}(\tau,x). \qquad (7)
\end{displaymath}
Although the crosstalk weights will likely overlap some primaries, the primaries' flatness ensures that regularization operators (5) and (6) ``spread'' redundant information about the primaries from other $\bold m_{i,k}$ and other offsets to compensate for any losses.
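Because equation (7) is an element-wise (diagonal) weighting of each model panel, its adjoint for real weights is the same multiplication, which makes it straightforward to use as a model regularization term in an iterative least-squares solver. The sketch below illustrates this reading; the class name, the dictionary representation of the panels, and the $(i,k)$ indexing are assumptions for illustration, not the paper's implementation.

\begin{verbatim}
import numpy as np

class CrosstalkWeight:
    """Diagonal model-regularization operator of equation (7):
    r_m^[3](tau, x, i, k) = w_{i,k}(tau, x) * m_{i,k}(tau, x)."""

    def __init__(self, weights):
        # weights[(i, k)] is the 2-D weight panel w_{i,k}(tau, x).
        self.weights = weights

    def forward(self, model):
        # model[(i, k)] is the 2-D model panel m_{i,k}(tau, x);
        # the residual is the element-wise product on every panel.
        return {key: self.weights[key] * panel for key, panel in model.items()}

    def adjoint(self, resid):
        # A real diagonal weighting is self-adjoint, so the adjoint
        # is the same element-wise multiplication.
        return {key: self.weights[key] * panel for key, panel in resid.items()}


# Hypothetical usage: two model panels m_{0,0} and m_{1,0} of size ntau-by-nx.
ntau, nx = 500, 40
model = {(i, 0): np.zeros((ntau, nx)) for i in (0, 1)}
weights = {key: np.ones((ntau, nx)) for key in model}
r_m3 = CrosstalkWeight(weights).forward(model)
\end{verbatim}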
